Murmura is a comprehensive Ray-based framework for federated and decentralized machine learning. Built for researchers and developers, it provides production-ready tools for distributed machine learning with advanced privacy guarantees and flexible network topologies.
Murmura supports both centralized and fully decentralized learning environments. Built on Ray for distributed computing, it enables researchers to experiment with different network topologies, aggregation strategies, and privacy-preserving techniques across single-node and multi-node clusters.
- **Ray-Based Distributed Computing** - Multi-node cluster support with automatic actor lifecycle management and resource optimization
- **Flexible Learning Paradigms** - Both centralized federated learning and fully decentralized peer-to-peer learning
- **Multiple Network Topologies** - Star, ring, complete graph, line, and custom topologies with automatic compatibility validation
- **Intelligent Resource Management** - Automatic eager/lazy dataset loading, CPU/GPU allocation, and placement strategies

- **Comprehensive Differential Privacy** - Client-level DP with Opacus integration, RDP privacy accounting, and automatic noise calibration
- **Byzantine-Robust Aggregation** - Trimmed mean and secure aggregation strategies for adversarial environments
- **Privacy Budget Tracking** - Real-time privacy budget monitoring across clients and training rounds

- **Unified Dataset Interface** - Seamless integration with HuggingFace datasets, PyTorch datasets, and custom data
- **Flexible Data Partitioning** - IID and non-IID data distribution with Dirichlet and quantity-based partitioning (see the partitioning sketch after this list)
- **PyTorch Model Integration** - Easy integration with existing PyTorch models and automatic DP adaptation

- **Real-Time Training Visualization** - Network topology visualization, training progress tracking, and metrics export
- **Comprehensive Monitoring** - Actor health checks, resource usage tracking, and event-driven architecture
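To make the non-IID partitioning concrete, here is a minimal, framework-agnostic sketch of Dirichlet label partitioning. The function name and parameters below are illustrative only and are not part of Murmura's API; see the data partitioning modules for the actual interface.

```python
import numpy as np

def dirichlet_partition(labels, num_clients, alpha=0.5, seed=42):
    """Illustrative Dirichlet label partitioning (not Murmura's API).

    Lower alpha produces more skewed (non-IID) label distributions per client.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_indices = [[] for _ in range(num_clients)]
    for cls in np.unique(labels):
        cls_idx = np.where(labels == cls)[0]
        rng.shuffle(cls_idx)
        # Sample per-client proportions for this class from a Dirichlet prior
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        # Convert proportions to split points and distribute the class indices
        splits = (np.cumsum(proportions)[:-1] * len(cls_idx)).astype(int)
        for client_id, chunk in enumerate(np.split(cls_idx, splits)):
            client_indices[client_id].extend(chunk.tolist())
    return client_indices

# Example: 10 clients over 1,000 synthetic labels drawn from 10 classes
parts = dirichlet_partition(np.random.randint(0, 10, size=1000), num_clients=10, alpha=0.3)
print([len(p) for p in parts])
```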
```bash
# Install with Poetry
poetry install

# Or with pip
pip install murmura
```
```python
from murmura.orchestration.learning_process import FederatedLearningProcess
from murmura.orchestration.orchestration_config import OrchestrationConfig
from murmura.aggregation.aggregation_config import AggregationConfig

# Configure federated learning
config = OrchestrationConfig(
    num_clients=10,
    num_rounds=50,
    topology_type="star",
    aggregation_config=AggregationConfig(strategy="fedavg"),
)

# Run federated learning
process = FederatedLearningProcess(config)
results = process.run()
```
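For the decentralized path, a variation along the following lines should apply. This is only a sketch: the import path for `DecentralizedLearningProcess`, the `"ring"`/`"gossip_avg"` strings, and the reuse of the same config fields are assumptions based on the quick start above and the components listed below, so check the examples directory for canonical usage.

```python
# Hypothetical decentralized setup: a ring of peers using gossip averaging.
# Import path and strategy name are assumptions, not confirmed API.
from murmura.orchestration.learning_process import DecentralizedLearningProcess
from murmura.orchestration.orchestration_config import OrchestrationConfig
from murmura.aggregation.aggregation_config import AggregationConfig

config = OrchestrationConfig(
    num_clients=10,
    num_rounds=50,
    topology_type="ring",
    aggregation_config=AggregationConfig(strategy="gossip_avg"),
)

process = DecentralizedLearningProcess(config)
results = process.run()
```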
Explore complete examples in the `murmura/examples/` directory:

- `mnist_example.py` - Basic federated learning with MNIST
- `dp_mnist_example.py` - Differential privacy-enabled federated learning
- `decentralized_mnist_example.py` - Fully decentralized learning without a central server
- `skin_lesion_example.py` - Medical imaging federated learning
- Learning Processes - `FederatedLearningProcess` and `DecentralizedLearningProcess` for the two learning paradigms
- Cluster Manager - Ray-based distributed computing with multi-node support
- Aggregation Strategies - FedAvg, TrimmedMean, and GossipAvg, with DP variants
- Network Topologies - Flexible network structures for decentralized learning
- Privacy Framework - Comprehensive differential privacy with Opacus integration (see the sketch below)
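The privacy framework builds on Opacus. The snippet below is plain PyTorch/Opacus usage, not Murmura-specific code; it shows the standard DP-SGD wrapping (per-sample clipping, Gaussian noise, privacy accounting) that the framework's automatic DP adaptation refers to.

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

# Standard Opacus wiring for differentially private training on one client.
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
data = TensorDataset(torch.randn(256, 1, 28, 28), torch.randint(0, 10, (256,)))
loader = DataLoader(data, batch_size=32)

privacy_engine = PrivacyEngine()
model, optimizer, loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=loader,
    noise_multiplier=1.0,   # Gaussian noise scale
    max_grad_norm=1.0,      # per-sample gradient clipping bound
)

criterion = nn.CrossEntropyLoss()
for x, y in loader:
    optimizer.zero_grad()
    criterion(model(x), y).backward()
    optimizer.step()

# Query the privacy budget spent so far for a given delta
print(f"epsilon = {privacy_engine.get_epsilon(delta=1e-5):.2f}")
```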
| Strategy | Type | Privacy-Enabled | Best For |
|---|---|---|---|
| FedAvg | Centralized | ✅ | Standard federated learning |
| TrimmedMean | Centralized | ✅ | Adversarial environments |
| GossipAvg | Decentralized | ✅ | Peer-to-peer networks |
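As a reference point for what TrimmedMean does, here is a minimal coordinate-wise trimmed mean over flattened client updates. This is an illustrative NumPy sketch, not Murmura's implementation, which may differ in details such as how the trimming ratio is configured.

```python
import numpy as np

def trimmed_mean(client_updates, trim_ratio=0.1):
    """Coordinate-wise trimmed mean over stacked client updates.

    Drops the `trim_ratio` fraction of lowest and highest values in each
    coordinate before averaging, bounding the influence of Byzantine clients.
    """
    stacked = np.stack(client_updates)            # shape: (num_clients, num_params)
    k = int(trim_ratio * stacked.shape[0])        # number trimmed from each end
    sorted_vals = np.sort(stacked, axis=0)        # sort each coordinate across clients
    kept = sorted_vals[k:stacked.shape[0] - k]    # drop k smallest and k largest
    return kept.mean(axis=0)

# Example: 10 clients, one of which sends an adversarially scaled update
updates = [np.ones(4) + 0.01 * i for i in range(9)] + [np.full(4, 100.0)]
print(trimmed_mean(updates, trim_ratio=0.1))      # stays close to 1.0
```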
- Star - Central server with spoke clients (federated learning)
- Ring - Circular peer-to-peer communication
- Complete - Full mesh networking (all-to-all)
- Line - Sequential peer communication
- Custom - User-defined adjacency matrices
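For custom topologies, the adjacency matrix simply marks which clients exchange updates. The snippet below builds a 5-node ring as a matrix; how the matrix is passed into the topology configuration is not shown here, so consult the examples directory for the actual wiring.

```python
import numpy as np

# Adjacency matrix for a 5-node ring: client i talks to its two ring neighbours.
# Only the matrix structure is illustrated; handing it to Murmura is not shown.
num_clients = 5
adjacency = np.zeros((num_clients, num_clients), dtype=int)
for i in range(num_clients):
    adjacency[i, (i + 1) % num_clients] = 1   # next neighbour
    adjacency[i, (i - 1) % num_clients] = 1   # previous neighbour

print(adjacency)
```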
```bash
# Set up a development environment
poetry install
poetry shell

# Run all tests
pytest

# Run with coverage
pytest --cov=murmura tests/

# Run excluding integration tests
pytest -m "not integration"

# Code quality checks
ruff check      # Linting
ruff format     # Code formatting
mypy murmura/   # Type checking
```
- Enhanced Privacy Techniques - Homomorphic encryption and secure multi-party computation
- Advanced Network Simulation - Realistic network conditions and fault injection
- AI Agent Integration - Autonomous learning agents for dynamic environments
- Real-world Deployment Tools - Production deployment and monitoring capabilities
We'd love your help building Murmura.
Start by checking out the issues or submitting a pull request.
Licensed under the GNU GPLv3. See LICENSE for more details.
For questions or feedback, open an issue or email [email protected].
Subscribe to our newsletter to receive updates on Murmura's development and be the first to know about new features and releases. Visit our website for more information.