
Murmura


Murmura is a Ray-based framework for federated and decentralized machine learning. Built for researchers and developers, it provides production-ready tools for distributed training with differential privacy guarantees and flexible network topologies.

๐ŸŒ What is Murmura?

Murmura supports both centralized federated learning and fully decentralized peer-to-peer learning. Built on Ray for distributed computing, it lets researchers experiment with different network topologies, aggregation strategies, and privacy-preserving techniques on single-node and multi-node clusters.

🧩 Key Features

Core Framework

  • ๐Ÿ—๏ธ Ray-Based Distributed Computing
    Multi-node cluster support with automatic actor lifecycle management and resource optimization

  • ๐Ÿ”„ Flexible Learning Paradigms
    Both centralized federated learning and fully decentralized peer-to-peer learning

  • ๐ŸŒ Multiple Network Topologies
    Star, ring, complete graph, line, and custom topologies with automatic compatibility validation

  • โšก Intelligent Resource Management
    Automatic eager/lazy dataset loading, CPU/GPU allocation, and placement strategies
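
For readers new to Ray, the actor model Murmura builds on looks roughly like this. The sketch below is generic Ray usage on a toy actor, not Murmura code; ClientActor and its method are made up for illustration.

import ray

ray.init()  # or ray.init(address="auto") to join an existing cluster

@ray.remote
class ClientActor:
    """Toy stand-in for a training client; Murmura manages pools of
    actors like this across single- and multi-node clusters."""

    def __init__(self, client_id):
        self.client_id = client_id

    def train_round(self):
        # A real client would train locally and return model updates.
        return {"client": self.client_id, "loss": 0.1 * self.client_id}

# Spawn four remote actors and run one "round" on each in parallel.
actors = [ClientActor.remote(i) for i in range(4)]
print(ray.get([actor.train_round.remote() for actor in actors]))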

Privacy & Security

  • 🔐 Comprehensive Differential Privacy
    Client-level DP with Opacus integration, RDP privacy accounting, and automatic noise calibration (a standalone Opacus sketch follows this list)

  • 🛡️ Byzantine-Robust Aggregation
    Trimmed mean and secure aggregation strategies for adversarial environments

  • 📊 Privacy Budget Tracking
    Real-time privacy budget monitoring across clients and training rounds
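
Under the hood, client-level DP follows the standard Opacus recipe: clip each per-sample gradient, add calibrated Gaussian noise, and track the spent budget with an RDP accountant. The sketch below shows that recipe standalone on a toy model; it illustrates the technique Murmura builds on (not Murmura's internal code), and the hyperparameter values are arbitrary.

import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

model = nn.Sequential(nn.Linear(20, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
dataset = TensorDataset(torch.randn(256, 20), torch.randint(0, 2, (256,)))
loader = DataLoader(dataset, batch_size=32)

# Opacus calibrates the noise multiplier so that training for `epochs` epochs
# stays within the (target_epsilon, target_delta) budget under RDP accounting.
privacy_engine = PrivacyEngine(accountant="rdp")
model, optimizer, loader = privacy_engine.make_private_with_epsilon(
    module=model,
    optimizer=optimizer,
    data_loader=loader,
    target_epsilon=8.0,
    target_delta=1e-5,
    epochs=5,
    max_grad_norm=1.0,  # per-sample gradient clipping bound
)

criterion = nn.CrossEntropyLoss()
for features, labels in loader:  # one epoch shown
    optimizer.zero_grad()
    criterion(model(features), labels).backward()  # grads clipped + noised
    optimizer.step()

print(f"epsilon spent so far: {privacy_engine.get_epsilon(delta=1e-5):.2f}")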

Data & Models

  • 📦 Unified Dataset Interface
    Seamless integration with HuggingFace datasets, PyTorch datasets, and custom data

  • 🎯 Flexible Data Partitioning
    IID and non-IID data distribution with Dirichlet and quantity-based partitioning (a partitioning sketch follows this list)

  • 🤖 PyTorch Model Integration
    Easy integration with existing PyTorch models and automatic DP adaptation
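
For non-IID experiments, Dirichlet partitioning draws each class's per-client proportions from Dirichlet(alpha), so smaller alpha values yield more label skew per client. Below is a standalone sketch of the technique (illustrative, not Murmura's implementation):

import numpy as np

def dirichlet_partition(labels, num_clients, alpha, seed=0):
    """Split sample indices across clients; smaller alpha => more skew."""
    rng = np.random.default_rng(seed)
    clients = [[] for _ in range(num_clients)]
    for cls in np.unique(labels):
        idx = np.flatnonzero(labels == cls)
        rng.shuffle(idx)
        # Per-client share of this class, drawn from Dirichlet(alpha).
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        cuts = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for client, shard in zip(clients, np.split(idx, cuts)):
            client.extend(shard.tolist())
    return clients

# 60,000 fake labels over 10 classes (MNIST-sized), split across 10 clients.
labels = np.random.default_rng(0).integers(0, 10, size=60_000)
shards = dirichlet_partition(labels, num_clients=10, alpha=0.5)
print([len(s) for s in shards])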

Monitoring & Visualization

  • 📈 Real-Time Training Visualization
    Network topology visualization, training progress tracking, and metrics export

  • 🔍 Comprehensive Monitoring
    Actor health checks, resource usage tracking, and event-driven architecture

🚀 Quick Start

Installation

# Install with Poetry
poetry install

# Or with pip
pip install murmura

Basic Usage

from murmura.orchestration.learning_process import FederatedLearningProcess
from murmura.orchestration.orchestration_config import OrchestrationConfig
from murmura.aggregation.aggregation_config import AggregationConfig

# Configure federated learning
config = OrchestrationConfig(
    num_clients=10,
    num_rounds=50,
    topology_type="star",
    aggregation_config=AggregationConfig(strategy="fedavg")
)

# Run federated learning
process = FederatedLearningProcess(config)
results = process.run()
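
A decentralized run is configured the same way but swaps in the decentralized learning process, a peer-to-peer topology, and gossip aggregation. The sketch below assumes DecentralizedLearningProcess shares the import path and config surface shown above and that the gossip strategy key is "gossip_avg"; see decentralized_mnist_example.py for the authoritative version.

# Assumes the same module layout and config fields as the snippet above.
from murmura.orchestration.learning_process import DecentralizedLearningProcess
from murmura.orchestration.orchestration_config import OrchestrationConfig
from murmura.aggregation.aggregation_config import AggregationConfig

config = OrchestrationConfig(
    num_clients=10,
    num_rounds=50,
    topology_type="ring",  # peer-to-peer: no central server
    aggregation_config=AggregationConfig(strategy="gossip_avg"),  # assumed key
)

process = DecentralizedLearningProcess(config)
results = process.run()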

Examples

Explore complete examples in the murmura/examples/ directory:

  • mnist_example.py - Basic federated learning with MNIST
  • dp_mnist_example.py - Differential privacy-enabled federated learning
  • decentralized_mnist_example.py - Fully decentralized learning without central server
  • skin_lesion_example.py - Medical imaging federated learning

๐Ÿ—๏ธ Architecture

Core Components

  • Learning Processes - FederatedLearningProcess and DecentralizedLearningProcess for different learning paradigms
  • Cluster Manager - Ray-based distributed computing with multi-node support
  • Aggregation Strategies - FedAvg, TrimmedMean, GossipAvg with DP variants
  • Network Topologies - Flexible network structures for decentralized learning
  • Privacy Framework - Comprehensive differential privacy with Opacus integration

📊 Supported Aggregation Strategies

Strategy      Type           Privacy-Enabled  Best For
FedAvg        Centralized    ✅               Standard federated learning
TrimmedMean   Centralized    ✅               Adversarial environments
GossipAvg     Decentralized  ✅               Peer-to-peer networks
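
For intuition, coordinate-wise trimmed mean sorts each parameter across client updates and discards the extremes on both ends before averaging, which bounds the influence any single Byzantine client can have. Below is a standalone sketch of the technique (not Murmura's implementation; the trim fraction is arbitrary):

import numpy as np

def trimmed_mean(client_updates, trim_fraction=0.2):
    """Average updates after dropping the k largest and k smallest values
    per coordinate, where k = floor(trim_fraction * num_clients)."""
    stacked = np.stack(client_updates)  # (num_clients, num_params)
    k = int(trim_fraction * len(client_updates))
    ordered = np.sort(stacked, axis=0)  # sort each coordinate independently
    return ordered[k: len(client_updates) - k].mean(axis=0)

honest = [np.random.default_rng(i).normal(0.0, 0.1, 4) for i in range(8)]
byzantine = [np.full(4, 100.0), np.full(4, -100.0)]  # adversarial updates
print(trimmed_mean(honest + byzantine))  # stays near 0 despite the outliers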

๐ŸŒ Network Topologies

  • Star - Central server with spoke clients (federated learning)
  • Ring - Circular peer-to-peer communication
  • Complete - Full mesh networking (all-to-all)
  • Line - Sequential peer communication
  • Custom - User-defined adjacency matrices (see the sketch below)
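
As a concrete example of a custom topology, here is the adjacency matrix for a 4-node ring, where entry [i][j] = 1 means nodes i and j exchange updates. The matrix format is standard graph notation; how it is handed to Murmura's topology configuration is not shown here.

import numpy as np

# Adjacency matrix for a 4-node ring: each node talks to its two neighbors.
num_nodes = 4
ring = np.zeros((num_nodes, num_nodes), dtype=int)
for i in range(num_nodes):
    ring[i, (i + 1) % num_nodes] = 1  # next neighbor
    ring[i, (i - 1) % num_nodes] = 1  # previous neighbor

print(ring)
# [[0 1 0 1]
#  [1 0 1 0]
#  [0 1 0 1]
#  [1 0 1 0]]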

🛠️ Development

Setup

poetry install
poetry shell

Testing

# Run all tests
pytest

# Run with coverage
pytest --cov=murmura tests/

# Run excluding integration tests
pytest -m "not integration"

Code Quality

ruff check          # Linting
ruff format         # Code formatting
mypy murmura/       # Type checking

🔮 Future Roadmap

  • Enhanced Privacy Techniques - Homomorphic encryption and secure multi-party computation
  • Advanced Network Simulation - Realistic network conditions and fault injection
  • AI Agent Integration - Autonomous learning agents for dynamic environments
  • Real-world Deployment Tools - Production deployment and monitoring capabilities

๐Ÿค Contributing

We'd love your help building Murmura.
Start by checking out the issues or submitting a pull request.

📄 License

Licensed under the GNU GPLv3. See LICENSE for more details.

📬 Contact

For questions or feedback, open an issue or email [email protected].

📰 Stay Updated

Subscribe to our newsletter to receive updates on Murmura's development and be the first to know about new features and releases. Visit our website for more information.