Anthony J. Pennings, PhD

WRITINGS ON DIGITAL ECONOMICS, ENERGY STRATEGIES, AND GLOBAL COMMUNICATIONS

AI and Remote Sensing for Monitoring Landslides and Flooding

Posted on June 24, 2024

Remarks prepared for the 2024 United Nations Public Service Forum, ‘Fostering Innovation Amid Global Challenges: A Public Sector Perspective,’ Songdo Convensia, Republic of Korea, 24–26 June 2024. Organized by the Ministry of the Interior and Safety (MOIS), Republic of Korea.

Thank you very much for this opportunity to discuss how science and technology can address the important issues of landslides and urban flooding. A few days after I was invited to this conference, a devastating landslide occurred in Papua New Guinea. Fatalities are still being tallied but are likely to be between 700 and 1,200 people.

Flooding is also a tragic sign of our times. As climate change has significantly increased the amount of water vapor in the atmosphere, the resulting precipitation increasingly resembles the turbulence created by a child in a filled bathtub. Some of the worst 2023 flooding occurred in Beijing, the Congo, Greece, Libya, Myanmar, and Pakistan. These floods took thousands of lives, displaced tens of thousands of people, and caused tens of billions of dollars in property damage.

As requested, I will talk about Artificial Intelligence (AI) along with a remote sensing model I use in my graduate course, EST 561 Sensing Technologies for Disaster Risk Reduction, at the State University of New York, Korea, here in Songdo. The “Seven Processes of Remote Sensing” from the Canada Centre for Remote Sensing (CCRS) Fundamentals of Remote Sensing provides a useful frame for understanding how AI and remote sensing can work together.[1] AI can be implemented at several stages of the sensing process. I list the seven processes of the CCRS model on this slide and in the notes below.[2]

The “Seven Processes of Remote Sensing” from
the Canada Centre for Remote Sensing (CCRS)

Remote sensing, the detecting and monitoring of physical characteristics of an area by using sensing technologies to measure the reflected and emitted radiation at a distance, generates vast amounts of data. This data needs to be accurately collected, categorized, and interpreted for information that can be used by first responders and other decision-makers, including policy-makers.

AI algorithms, particularly those involving machine learning (ML) and deep learning (DL), can automate the extraction and use of remote sensing data, such as identifying water bodies, soil moisture levels, vegetation health, and ground deformations. This can speed up analysis and increase accuracy in crucial situations. Just as AI has proven to be extremely useful in detecting cancerous cells, AI is increasingly able to interpret complex geographical and hydrological imagery.[3]

The primary sensing model involves an energy source (if passive, the Sun, the Moon, or thermal energy; if active, radio waves, lasers, or radar; LiDAR, for example, can measure ground height); the platform for emitting or receiving the energy (space-based, airborne, or mobile); the interaction of energy with the atmosphere (which may include clouds, smoke, dust, rain, fog, snow, and steam); and the interaction of energy with the target (what is absorbed, reflected, or scattered?).

The Energy Source (A)

Sensing technologies rely on data that comes from an energy source that is either passive or active. AI can analyze data from passive sources like sunlight or moonlight reflected off the Earth’s surface. For example, it can use satellite imagery from reflected sunlight to detect changes in land and water surfaces that may indicate flooding or landslides. AI can also process data from active sources such as LiDAR, radar, and lasers. Light Detection and Ranging (LiDAR), which uses light instead of radio waves, can measure variations in ground height with high precision, helping to identify terrain changes that may precede a landslide and to measure the mass of land that has shifted after the event.

Synthetic Aperture Radar (SAR) sensors on satellites such as Sentinel-1 emit microwave pulses, typically in the X-band (8–12 GHz) or C-band (4–8 GHz), that can penetrate cloud cover and provide high-resolution images of the Earth’s surface, making it possible to detect and map flooded areas even during heavy rains or at night. Also, passive optical systems such as Landsat, Sentinel-2, and NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) capture visible and infrared imagery that AI can use to delineate flood boundaries by identifying water bodies and saturated soils.

Interaction of Energy with the Atmosphere (B)

Sensing from satellites in Earth orbit (and other space-based platforms) is highly structured by what’s in the atmosphere.[4] AI can adjust for atmospheric conditions such as clouds, smoke, dust, rain, fog, snow, and steam when analyzing remote sensing data. Machine learning algorithms can be trained to recognize and compensate for these factors, improving the accuracy of flood and landslide detection.

Machine learning models can also simulate how different atmospheric conditions affect radiation, helping to better understand and interpret the data received during various weather scenarios, which is crucial for accurate flood and landslide detection.

Interaction of Energy with the Target (C)

AI can analyze how different surfaces absorb, reflect, or scatter energy. Water bodies, for example, have distinct reflective properties compared to dry land, which AI can detect to identify flooding. “Water loves red,” meaning that it absorbs red wavelengths and reflects blue, giving us our beautiful blue oceans; particulate matter often absorbs the blue rays instead, resulting in greenish waters. Similarly, AI can identify subtle changes in vegetation or soil moisture that might indicate a landslide. Researchers in Japan are acutely aware of these possibilities given the country’s often mountainous terrain and frequent heavy rains.[5]

Water and vegetation may reflect similarly in the visible wavelengths but are almost always separable in the infrared. At about 0.7 micrometers (µm), reflectance begins to vary considerably (see image below). Spectral response can be quite variable, even for the same target type, and can also vary with time (e.g., the “green-ness” of leaves) and location.[6]

Knowing where to “look” spectrally and understanding the factors which influence the spectral response of the features of interest are critical to correctly interpreting the interaction of electromagnetic radiation with the surface.
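This separability is what simple water indices exploit. Below is a minimal sketch of the idea using the Normalized Difference Water Index (NDWI), which contrasts green and near-infrared reflectance; the reflectance values and the zero threshold are illustrative assumptions, not measurements from any particular sensor.

```python
import numpy as np

# Hypothetical green and near-infrared reflectance for a 2x2 image patch
# (values scaled 0..1). Top row resembles vegetation, bottom row water.
green = np.array([[0.10, 0.12], [0.35, 0.30]])
nir   = np.array([[0.45, 0.50], [0.05, 0.08]])

# NDWI = (Green - NIR) / (Green + NIR). Water absorbs NIR strongly,
# so NDWI is positive over water and negative over vegetation.
ndwi = (green - nir) / (green + nir)

# A simple threshold separates water pixels from land/vegetation pixels.
water_mask = ndwi > 0.0
```

In practice, thresholds are tuned per scene, and ML classifiers often replace the fixed cutoff, but the spectral contrast being learned is the same one shown here.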

The Platform and Recording of Energy by the Sensor (D)

Platforms can be space-based, airborne, or mobile. After the energy has been scattered by, or emitted from the target, we require a sensor on a platform to collect and record the electromagnetic radiation. Remote sensing systems that measure energy naturally available are called passive sensors. Passive sensors can only be used to detect energy when the naturally occurring energy is available and makes it through the atmosphere.

Active sensors, like the LiDAR mentioned before, provide their own energy source for illumination. These sensors emit radiation directed toward the target; the radiation reflected from that target is detected and measured by the sensing platform.

AI can analyze data from platforms like satellites for large-scale monitoring of land and water events. Satellite technologies like SAR provide extensive coverage and can track changes over time, making them ideal for detecting floods and landslides. Aircraft and drones equipped with sensors can collect detailed local data, allowing AI to process this data in real-time and provide immediate insights. Ground-based sensors on cell towers, IoT devices, and mobile units such as Boston Dynamics’ SPOT robots can provide continuous monitoring at locations that may not be accessible to other platforms.

AI can integrate data from these platforms for a comprehensive view of an area, such as the identification of landslide-prone areas through soil and vegetation analysis. High-resolution digital elevation models (DEMs) created from LiDAR or photogrammetry help identify areas with steep slopes and other topographic features associated with landslide risk. Multispectral scanning systems, which collect data over a variety of wavelengths, and hyperspectral imagers, which detect hundreds of very narrow spectral bands throughout the visible, near-infrared, and mid-infrared portions of the electromagnetic spectrum, can detect soil moisture levels and vegetation health, important indicators of landslide susceptibility.
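The slope analysis mentioned above can be sketched in a few lines: given a DEM grid, the elevation gradient yields a slope angle per cell, and cells above a susceptibility threshold can be flagged. The elevation values, the 10 m grid spacing, and the 25-degree threshold are all illustrative assumptions.

```python
import numpy as np

# Hypothetical 4x4 digital elevation model (meters) on a 10 m grid.
dem = np.array([
    [100.0, 100.0, 100.0, 100.0],
    [100.0, 105.0, 110.0, 115.0],
    [100.0, 110.0, 120.0, 130.0],
    [100.0, 115.0, 130.0, 145.0],
])
cell = 10.0  # grid resolution in meters

# Elevation gradients along the two grid axes.
dz_dy, dz_dx = np.gradient(dem, cell)

# Slope in degrees at each cell: arctan of the gradient magnitude.
slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))

# Flag cells steeper than an assumed 25-degree susceptibility threshold.
steep = slope > 25.0
```

A real susceptibility model would combine slope with soil moisture, vegetation, and geology layers, but the DEM-derived slope raster is typically the first input.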

Transmission, Reception, and Processing (E)

Energy recorded by the sensor is transmitted as data to a receiving and processing station and processed into an image (hardcopy and/or digital). Processed images are interpreted, visually and/or digitally, to extract information about the target. Spy satellites were early adopters of digital imaging technologies such as charge-coupled devices (CCDs) and CMOS (complementary metal oxide semiconductor) image sensors, which are now used in your smartphones. CCDs, the earlier technology, are still used for their superior image quality, and efforts continue to reduce their energy consumption. CMOS sensors, meanwhile, have seen major improvements in image quality.

CCD, CMOS, and Digital Sensing Imagery

These technologies both receive electromagnetic energy immediately (unlike film, which has to be developed) and convert it into images. In both cases, a photograph is represented and displayed in a digital format by subdividing the image into small equal-sized and shaped areas, called picture elements or pixels, and representing the brightness of each area with a numeric value or digital number.
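The digital-number step can be illustrated with a tiny sketch: scaling measured brightness into 8-bit pixel values, as a sensor's analog-to-digital conversion does. The radiance values are made up for illustration, and the min–max scaling is one simple convention among several.

```python
import numpy as np

# Hypothetical at-sensor brightness values for a 2x3 image patch.
radiance = np.array([[0.02, 0.40, 0.75],
                     [0.10, 0.55, 0.98]])

# Scale each pixel's brightness to an 8-bit digital number (0-255),
# so the darkest pixel maps to 0 and the brightest to 255.
lo, hi = radiance.min(), radiance.max()
dn = np.round((radiance - lo) / (hi - lo) * 255).astype(np.uint8)
```

Each value in `dn` is the pixel's digital number; higher bit depths (e.g., 12- or 16-bit) work the same way with a larger numeric range.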

Interpretation and Analysis (F)

Making sense of information from these technologies to understand changes in land formations and flooding can benefit from a number of analytical approaches. AI can monitor geographical features over time and space using techniques such as Time-Series Analysis, River and Stream Gauging, Post-Landslide Assessment, Damage and Impact Assessment, and Volume and Area Estimation.

Storing remote sensing data over time allows for monitoring the temporal dynamics of floods, including the rise and fall of water levels and the progression of floodwaters across a landscape. The Landsat archives provide a rich library of imagery dating back to the 1970s. Remote sensing can also supplement ground-based river and stream gauges by providing spatially extensive measurements of water surface elevation and flow rates.
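A minimal sketch of such time-series analysis: given water masks derived from successive satellite passes, summing water pixels per acquisition tracks the flood's rise, peak, and recession. The masks, dates, and 100 m pixel size are illustrative assumptions.

```python
import numpy as np

# Hypothetical stack of per-pass water masks (True = water pixel),
# e.g., derived from SAR or optical imagery over four days.
passes = np.array([
    [[0, 0], [1, 0]],   # day 1: flood beginning
    [[1, 0], [1, 1]],   # day 2: rising
    [[1, 1], [1, 1]],   # day 3: peak extent
    [[0, 1], [1, 0]],   # day 4: receding
], dtype=bool)

pixel_area_km2 = 0.01  # assumed 100 m x 100 m pixels

# Flooded area per acquisition date, and the date of maximum extent.
area = passes.sum(axis=(1, 2)) * pixel_area_km2
peak_day = int(area.argmax())
```

The same per-date aggregation underlies more sophisticated analyses, such as fitting hydrographs or correlating flood extent with upstream gauge readings.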

Likewise, stored information is useful in post-landslide assessment of the damage and impacts of a disaster. Post-event imagery helps assess the extent and impact of landslides on infrastructure, roads, and human settlements, aiding in disaster response and rehabilitation efforts. Remote sensing can also quantify the volume of displaced material and the area affected by landslides, which is essential for understanding the scale of the event and planning recovery operations. This often relies on structural geology and the study of faults, folds, synclines, anticlines, and lineaments. Understanding geological structures is the key to mapping potential geohazards (e.g., landslides).[p 198] AI can classify areas affected by floods or landslides, using deep learning to recognize patterns and changes in the landscape. Subsequently, AI can use predictive analytics to identify climate and geologic trends and provide forecasts for flood and landslide risks by analyzing historical and real-time data, giving early warnings and insights.

AI Integration and Applications (G)

Techniques such as data fusion can combine remote sensing data from multiple sensors (e.g., optical, radar, LiDAR) with ground-based observations to enhance the overall quality and resolution of the information.

AI applications can analyze real-time data from sensors to detect rising water levels and predict potential flooding areas. Machine learning algorithms can recognize patterns in historical data, improving the prediction models for future flood events. AI systems can automatically generate alerts and warnings for authorities and the public, allowing for timely evacuations and preparations.
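The alerting step described above can be sketched as a simple tiered rule over recent gauge readings. The `flood_alert` function, its thresholds, and the readings are all hypothetical; real warning systems calibrate thresholds per river basin and feed in model forecasts rather than raw levels alone.

```python
# A minimal sketch of automated alert generation from water-level readings.
# All thresholds (meters) are illustrative, not from any real system.

def flood_alert(levels, warn=4.0, evacuate=5.5):
    """Return an alert tier from a sequence of recent readings (meters)."""
    current = levels[-1]
    rising = len(levels) >= 2 and levels[-1] > levels[-2]
    if current >= evacuate:
        return "EVACUATE"       # above the evacuation threshold
    if current >= warn and rising:
        return "WARNING"        # above warn level and still rising
    if current >= warn:
        return "WATCH"          # above warn level but stable or falling
    return "NORMAL"

print(flood_alert([3.2, 3.8, 4.3]))  # prints "WARNING"
```

An ML-based system would replace the fixed thresholds with a learned prediction of future levels, but the escalation logic that turns predictions into public alerts looks much like this.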

AI can analyze topographical data from LiDAR sensing technologies to detect ground movement and changes in terrain that precede landslides. AI can process data from ground-based sensors to monitor soil moisture levels, a critical factor in landslide risk. By learning from past landslide events, AI can predict areas at risk and suggest mitigation measures.

Conclusion

By integrating AI into each of these processes, remote sensing can become more accurate, efficient, and insightful, providing valuable data for a wide range of applications. As AI contributes at each stage of the remote sensing process, the detection, monitoring, and response to floods and landslides can be significantly improved, leading to better disaster risk management and mitigation strategies. Remote sensing technologies, when combined with ground-based river and stream gauges, provide a spatially extensive and temporally rich dataset for monitoring water surface elevation and flow rates. This combination enhances the accuracy of hydrological models, improves early warning systems, and supports effective water resource management and disaster risk reduction efforts.

Citation APA (7th Edition)

Pennings, A.J. (2024, Oct 27). AI and Remote Sensing for Monitoring Landslides and Flooding. apennings.com https://apennings.com/technologies-of-meaning/the-value-of-science-technology-and-society-studies-sts/

Notes

[1] Canada Centre for Remote Sensing. (n.d.). Fundamentals of Remote Sensing. Retrieved from https://natural-resources.canada.ca/maps-tools-and-publications/satellite-imagery-elevation-data-and-air-photos/tutorial-fundamentals-remote-sensing/introduction/9363
[2] The Canada Centre for Remote Sensing (CCRS) Model:
1. Energy Source or Illumination (A)
2. Radiation and the Atmosphere (B)
3. Interaction with the Target (C)
4. Recording of Energy by the Sensor (D)
5. Transmission, Reception, and Processing (E)
6. Interpretation and Analysis (F)
7. Application (G) – Information extracted from the imagery about the target in order to better understand it, reveal some new information, or assist in solving a particular problem.
[3] Zhang B, Shi H, Wang H. Machine Learning and AI in Cancer Prognosis, Prediction, and Treatment Selection: A Critical Approach. J Multidiscip Healthc. 2023 Jun 26;16:1779-1791. doi: 10.2147/JMDH.S410301. PMID: 37398894; PMCID: PMC10312208.
[4] A good illustration of which atmospheric conditions influence which electromagnetic emissions can be found at: NASA Earthdata. (n.d.). Remote sensing. NASA. Retrieved from https://www.earthdata.nasa.gov/learn/backgrounders/remote-sensing
[5] Asada H, Minagawa T. Impact of Vegetation Differences on Shallow Landslides: A Case Study in Aso, Japan. Water. 2023; 15(18):3193. https://doi.org/10.3390/w15183193
[6] The Near-Infrared (NIR) and Short-Wave Infrared (SWIR) ranges of the infrared spectrum are highly effective for sensing water, while NIR (and to some extent the red edge) is better suited for sensing vegetation. Water strongly absorbs infrared radiation in these ranges, making it appear dark in NIR and SWIR imagery. This absorption characteristic allows for the identification and analysis of water bodies, moisture content in soil, and even snow and ice, and can be used for monitoring lakes, rivers, and reservoirs and for assessing soil moisture levels for irrigation management. Vegetation, by contrast, strongly reflects NIR light due to the structure of plant leaves. This high reflectance makes NIR ideal for monitoring vegetation health and biomass: healthy, chlorophyll-rich vegetation reflects more NIR light than stressed or diseased plants.
The transition zone between the red and NIR parts of the spectrum, known as the “red edge,” is particularly sensitive to changes in plant health and chlorophyll content. The Normalized Difference Vegetation Index (NDVI) is a commonly used index that combines red and NIR reflectance to assess vegetation health and coverage. NDVI is calculated as (NIR – Red) / (NIR + Red). Higher NDVI values indicate healthier and denser vegetation.
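As a worked sketch of the NDVI arithmetic in this note (the reflectance values are illustrative, chosen to represent healthy vegetation, stressed vegetation, and bare soil):

```python
import numpy as np

# Hypothetical red and NIR reflectance for three surface types:
# healthy vegetation, stressed vegetation, and bare soil.
red = np.array([0.05, 0.15, 0.30])
nir = np.array([0.50, 0.30, 0.35])

# NDVI = (NIR - Red) / (NIR + Red), ranging from -1 to 1.
# Healthy vegetation reflects much more NIR than red, so its NDVI is high.
ndvi = (nir - red) / (nir + red)
```

The healthy-vegetation pixel yields an NDVI above 0.8, the stressed pixel a moderate value, and bare soil a value near zero, matching the interpretation described above.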

© ALL RIGHTS RESERVED



Anthony J. Pennings, PhD is a Professor at the Department of Technology and Society, State University of New York, Korea, teaching ICT for sustainable development and engineering economics. From 2002 to 2012 he was on the faculty of New York University, where he taught digital economics and information systems management. He also taught in the Digital Media MBA at St. Edwards University in Austin, Texas, where he lives when not in the Republic of Korea.
