Mechanism-design simulator for crowd labeling: peer-truth scoring incentivizes truthful reports and boosts accuracy over majority vote. Reproducible plots, tests, and a Streamlit demo.

biru-codeastromer/peertruth-sim

# PeerTruth-Sim

Simulation of peer-prediction incentives for crowd labels. Agents are truthful, random, or biased. A peer-truth-style scoring rule pays each agent for agreeing with a randomly chosen peer, weighted by the inverse frequency of the reported label, so rare (informative) agreements earn more than common ones. The simulator shows the resulting payoff ordering across agent types and the downstream label accuracy with and without the mechanism.
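The scoring rule above can be sketched as follows (a minimal illustration, not the repo's actual implementation; `peer_truth_payoff` and its signature are hypothetical):

```python
import random
from collections import Counter

def peer_truth_payoff(reports, i, rng=random):
    """Peer-truth style score for worker i on a single task:
    pay 1/frequency of i's report if it matches a randomly
    chosen peer's report, else pay nothing. Agreement on a
    rare label is rewarded more than agreement on a common one."""
    peers = [r for j, r in enumerate(reports) if j != i]
    peer = rng.choice(peers)
    freq = Counter(reports)[reports[i]] / len(reports)
    return 1.0 / freq if reports[i] == peer else 0.0
```

Truthful reporters tend to agree with each other above chance while still earning the inverse-frequency bonus, which is what pushes their expected payoff above that of random or biased reporters.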

## Quickstart

```sh
python -m venv .venv
. .venv/bin/activate    # Windows: .\.venv\Scripts\activate
pip install -r requirements.txt
make reproduce
make plot
make test
```

## What you get

- Average payoff by agent type under peer-prediction
- Majority-vote accuracy vs weighted-vote accuracy
- Sensitivity to truthful accuracy and class prior via the Streamlit demo
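The majority-vote vs weighted-vote comparison can be sketched like this (illustrative only; the weights stand in for per-worker scores produced by the mechanism, and the function names are not the repo's API):

```python
from collections import Counter

def majority_vote(reports):
    """Plain majority: the most common label wins."""
    return Counter(reports).most_common(1)[0][0]

def weighted_vote(reports, weights):
    """Weight each worker's label, e.g. by mechanism payoff,
    so higher-scoring (likely truthful) workers count more."""
    tally = {}
    for label, w in zip(reports, weights):
        tally[label] = tally.get(label, 0.0) + w
    return max(tally, key=tally.get)
```

When truthful workers earn higher payoffs under the mechanism, using those payoffs as weights lets a truthful minority outvote a biased majority, which is the accuracy gain the plots measure.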
