
Smart NatureLab: Bridging Digital and Natural Worlds

Building the future of biodiversity monitoring through multimodal AI and digital ecosystem modeling at The Wilds.

Long-Term Vision: The Wilds Biological Research Field Station

We are developing The Wilds as a living laboratory for research at the intersection of technology and nature, supporting long-term, transdisciplinary, multimodal projects across diverse research areas:

Exotic Endangered Species Research

  • Endangered ungulates, rhinos, and carnivores
  • Behavior studies and individual identification
  • Land-use pattern analysis

Conservation and Restoration Ecology

  • Invasive species control and native biodiversity monitoring
  • Habitat assessment and management
  • Endangered prairie habitat protection

Photo Credit: The Wilds

Digital Twin Development

Our work centers on developing a comprehensive digital twin of The Wilds, a 10,000-acre former strip mine in southeastern Ohio managed by the Columbus Zoo. This digital ecosystem will serve as a powerful tool for conservation planning, enabling researchers and wildlife managers to:

  • Test conservation strategies in virtual environments before real-world implementation
  • Optimize sensor deployment for maximum ecological insight with minimal animal disturbance
  • Predict ecosystem responses to environmental changes and management decisions
  • Train and validate AI models for biodiversity monitoring at scale

Multimodal AI for Biodiversity

Our research aims to advance the field of multimodal AI for environmental monitoring by creating datasets and models that can:

  • Synthesize information across sensor modalities (visual, acoustic, GPS, satellite, environmental)
  • Enable real-time adaptive sampling that responds to ecological events as they unfold (a sketch of this kind of trigger logic follows this list)
  • Support autonomous ecosystem monitoring with minimal human intervention
  • Scale conservation efforts through AI-assisted wildlife management
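To make the adaptive-sampling goal concrete, here is a minimal sketch of the kind of trigger logic involved, written in plain Python. The per-minute acoustic counts, window length, threshold factor, and the idea of "scheduling a drone flight" are illustrative assumptions, not part of our current pipeline.

```python
# Sketch: trigger an adaptive sampling action (e.g., request a drone flight)
# when recent acoustic activity rises well above its running baseline.
from collections import deque

class AdaptiveSampler:
    """Keeps a rolling window of per-minute acoustic detection counts and
    fires when the latest count exceeds the window mean by a fixed factor."""

    def __init__(self, window_minutes: int = 60, trigger_factor: float = 3.0):
        self.counts = deque(maxlen=window_minutes)
        self.trigger_factor = trigger_factor

    def update(self, detections_this_minute: int) -> bool:
        # Baseline is computed before adding the newest count so a spike
        # is compared against normal activity, not against itself.
        baseline = sum(self.counts) / len(self.counts) if self.counts else 0.0
        self.counts.append(detections_this_minute)
        return baseline > 0 and detections_this_minute > self.trigger_factor * baseline

sampler = AdaptiveSampler()
for minute, count in enumerate([2, 3, 2, 4, 2, 3, 15]):  # toy acoustic counts
    if sampler.update(count):
        print(f"minute {minute}: activity spike, schedule a follow-up drone flight")
```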

Impact Beyond The Wilds

This work establishes a replicable framework for digital twin development in conservation settings worldwide. Our open datasets and methodologies will enable:

  • Global conservation applications across diverse ecosystems and species
  • Training resources for the next generation of conservation AI models
  • Collaborative research platform connecting ecology, computer science, and conservation communities
  • Evidence-based conservation through data-driven decision making

Summer 2025 Proof-of-Concept

Project Scope

This summer represents our first step toward the digital twin vision. We're conducting intensive fieldwork to collect a multimodal dataset focused on a 100-acre area used by Pere David's deer and other species at The Wilds.

Study Area: Single pasture for proof-of-concept (View on Google Earth)

Timeline:

  • Field Deployment 1: June 2025
  • Field Deployment 2: August 2025

Sensor Network Deployment

Sensor Network Overview

Sensor network deployment. Clockwise: satellite, camera trap, AudioMoth bioacoustic monitor, Moovement GPS ear tag, and LiDAR-equipped fixed-wing drone.

| Sensor Type | Data Collected | Duration | Target Species | Specs |
|---|---|---|---|---|
| GPS Ear Tags | Hourly location + ID | Continuous | Pere David's deer | Moovement Platform |
| Quadcopter Drones | Behavioral videos | 2-3 days, 4 hrs total | Zebras, giraffes, onagers, Przewalski's horses, African wild dogs | Parrot Anafi, ModalAI Sentinel |
| Fixed-Wing Drones | Aerial photos + LiDAR | Single 2-hour session | Landscape + all species | LiDAR sensor |
| Camera Traps | Motion-triggered photos/video | 1 week continuous | All species (exotic + native) | Various models including GardePro |
| Bioacoustic Monitors | Continuous audio recording | 1 week continuous | Birds, insects, ungulate vocalizations | AudioMoth devices |
| Satellite Imagery | Landscape monitoring | Continuous archive | Vegetation + land-use patterns | |
| Weather Station | Environmental conditions | Continuous | Context for all other data | |
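To show how these streams could fit together, below is a minimal sketch (Python + pandas) that joins hypothetical GPS ear-tag fixes, camera-trap detections, and weather readings on a shared hourly timeline. The file names, column names, and hourly resolution are assumptions for illustration, not our actual data schema.

```python
# Minimal sketch: align GPS ear-tag fixes, camera-trap detections, and weather
# readings on a shared hourly timeline. File and column names are hypothetical.
import pandas as pd

# GPS ear tags: one row per animal per hourly fix (animal_id, timestamp, lat, lon)
gps = pd.read_csv("gps_ear_tags.csv", parse_dates=["timestamp"])

# Camera traps: one row per motion-triggered detection (camera_id, timestamp, species)
cams = pd.read_csv("camera_trap_detections.csv", parse_dates=["timestamp"])

# Weather station: one row per reading (timestamp, temp_c, wind_ms, rain_mm)
weather = pd.read_csv("weather_station.csv", parse_dates=["timestamp"])

# Bin everything to the hour so the streams share a join key.
for df in (gps, cams, weather):
    df["hour"] = df["timestamp"].dt.floor("h")

# Hourly summaries: mean deer locations, detection counts per species, mean weather.
gps_hourly = gps.groupby(["hour", "animal_id"])[["lat", "lon"]].mean().reset_index()
cam_hourly = cams.groupby(["hour", "species"]).size().rename("detections").reset_index()
weather_hourly = weather.groupby("hour")[["temp_c", "wind_ms", "rain_mm"]].mean().reset_index()

# A simple fused table: every GPS fix annotated with that hour's weather, plus
# how many camera-trap detections of any species occurred in the same hour.
fused = (
    gps_hourly
    .merge(weather_hourly, on="hour", how="left")
    .merge(cam_hourly.groupby("hour")["detections"].sum().reset_index(), on="hour", how="left")
    .fillna({"detections": 0})
)
print(fused.head())
```

Binning to a common time key is only the simplest form of fusion; finer-grained alignment (per detection event, per drone flight) would follow the same join pattern.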

Key Research Questions

This summer's fieldwork will specifically address:

  • Sensor modality strengths: Which sensors are most effective for tracking land use, animal interactions, and individual identification?
  • Spatial-temporal overlap: How well do different sensor types capture the same ecological events? (See the sketch after this list.)
  • Species identification: Can we reliably identify species through vocalizations alone?
  • Behavioral patterns: How do animals use space differently throughout the day and season?
  • Environmental influences: How do weather conditions affect animal behavior and sensor performance?
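As a rough illustration of the spatial-temporal overlap question, the sketch below checks, for each camera-trap detection, whether any GPS-tagged animal was nearby around the same time. It assumes the same hypothetical GPS and camera-trap tables as the earlier sketch plus a table of camera coordinates; the 30-minute window, 250 m radius, and flat-earth distance approximation are simplifying assumptions.

```python
# Sketch: for each camera-trap detection, check whether any GPS-tagged animal
# was within MAX_DIST_M meters during a +/- MAX_DT window. Purely illustrative.
import numpy as np
import pandas as pd

MAX_DT = pd.Timedelta(minutes=30)   # temporal window (assumption)
MAX_DIST_M = 250.0                  # spatial radius in meters (assumption)

gps = pd.read_csv("gps_ear_tags.csv", parse_dates=["timestamp"])             # animal_id, timestamp, lat, lon
cams = pd.read_csv("camera_trap_detections.csv", parse_dates=["timestamp"])  # camera_id, timestamp, species
sites = pd.read_csv("camera_sites.csv")                                      # camera_id, lat, lon
cams = cams.merge(sites, on="camera_id")

def approx_dist_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation; adequate over a single pasture."""
    m_per_deg = 111_320.0
    dlat = (lat2 - lat1) * m_per_deg
    dlon = (lon2 - lon1) * m_per_deg * np.cos(np.radians((lat1 + lat2) / 2))
    return np.hypot(dlat, dlon)

def has_gps_support(row):
    # Restrict to GPS fixes close in time, then test spatial proximity.
    nearby_in_time = gps[(gps["timestamp"] - row["timestamp"]).abs() <= MAX_DT]
    if nearby_in_time.empty:
        return False
    dists = approx_dist_m(row["lat"], row["lon"],
                          nearby_in_time["lat"], nearby_in_time["lon"])
    return bool((dists <= MAX_DIST_M).any())

cams["corroborated_by_gps"] = cams.apply(has_gps_support, axis=1)
print(cams["corroborated_by_gps"].mean())  # fraction of detections with GPS overlap
```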

Technical Innovation

We're testing the ICICLE cyberinfrastructure for autonomous monitoring, including:

  • Autonomous drone navigation and adaptive flight planning
  • Real-time camera trap image analysis and species detection (a minimal inference sketch follows this list)
  • Edge AI deployment for on-site data processing
  • Sensor fusion techniques for multimodal data integration
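As one hedged example of what the camera-trap and edge-AI items above could look like on a field device, here is a minimal polling loop built on onnxruntime with a hypothetical species classifier exported to ONNX. The model file, class list, input size, and watch directory are placeholders, not the actual ICICLE pipeline.

```python
# Sketch of an edge inference loop: watch a folder for new camera-trap images,
# run a (hypothetical) ONNX species classifier, and log detections locally.
import time
from pathlib import Path

import numpy as np
import onnxruntime as ort
from PIL import Image

MODEL_PATH = "species_classifier.onnx"                        # placeholder model
CLASSES = ["pere_davids_deer", "zebra", "giraffe", "other"]   # placeholder labels
WATCH_DIR = Path("/data/camera_trap/incoming")                # placeholder path

session = ort.InferenceSession(MODEL_PATH, providers=["CPUExecutionProvider"])
input_name = session.get_inputs()[0].name

def preprocess(path: Path) -> np.ndarray:
    """Resize to the model's expected input and scale to [0, 1], NCHW layout."""
    img = Image.open(path).convert("RGB").resize((224, 224))
    arr = np.asarray(img, dtype=np.float32) / 255.0
    return arr.transpose(2, 0, 1)[np.newaxis, ...]  # shape (1, 3, 224, 224)

seen = set()
while True:
    for path in sorted(WATCH_DIR.glob("*.jpg")):
        if path in seen:
            continue
        seen.add(path)
        logits = session.run(None, {input_name: preprocess(path)})[0]
        label = CLASSES[int(np.argmax(logits))]
        print(f"{path.name}: {label}")
    time.sleep(5)  # poll interval; a real deployment might use filesystem events
```

A real deployment would likely replace polling with camera-trigger or filesystem events and forward detections to the digital twin rather than printing them.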

Collaboration Opportunities

We invite collaborators to help shape this research and maximize its impact:

For AI/ML Researchers

  • Access to unique multimodal training datasets with ground-truth labels
  • Opportunities to develop novel sensor fusion approaches
  • Testing ground for autonomous sampling algorithms
  • Real-world validation of computer vision and bioacoustic models

For Ecologists & Conservation Biologists

  • Biological research questions that can be explored with multimodal data
  • Conservation applications of digital twin technology
  • Habitat management insights from automated monitoring
  • Species behavior analysis across multiple data streams

For Wildlife Managers

  • Practical applications of digital twin technology for conservation planning
  • Cost-effective monitoring strategies using autonomous sensor networks
  • Evidence-based management through continuous ecosystem monitoring
  • Scalable approaches for large-scale conservation efforts

What We Offer

  • Dataset access for research and model development
  • Co-authorship opportunities on resulting publications
  • Collaborative research with interdisciplinary teams
  • Student involvement in cutting-edge conservation technology

Get Involved

Ready to contribute to the future of conservation technology? We welcome:

  • Research collaboration proposals and partnership ideas
  • Biological research questions that could benefit from multimodal data
  • Technical expertise in AI, robotics, and sensor networks
  • Conservation applications and real-world deployment scenarios

Contact: Jenna Kline, [email protected]

Website: NatureLab at The Wilds

Funding: ICICLE (NSF AI Institute for Intelligent Cyberinfrastructure) and Imageomics (NSF HDR Institute: Imageomics: A New Frontier of Biological Information Powered by Knowledge-Guided Machine Learning).

Photo Credit: The Wilds
