Accuracy History

Model performance breakdown by confidence level

2025-26 Season

High Confidence (75%+): 75.0% (527 of 703 correct)
Medium Confidence (60-74%): 58.1% (182 of 313 correct)
Low Confidence (<60%): 48.4% (89 of 184 correct)
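The percentages above can be reproduced directly from the raw counts. A minimal sketch (the bucket names and counts are taken from this page; the dict layout is illustrative):

```python
# Correct/total counts per confidence bucket, as shown above.
buckets = {
    "High Confidence (75%+)": (527, 703),
    "Medium Confidence (60-74%)": (182, 313),
    "Low Confidence (<60%)": (89, 184),
}

for name, (correct, total) in buckets.items():
    # .1% formats the ratio as a percentage with one decimal place.
    print(f"{name}: {correct / total:.1%} ({correct} of {total} correct)")
```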

Notable Upsets
High-confidence predictions that were wrong
Date   | Matchup   | Final Score | Predicted Winner | Model Confidence
-------|-----------|-------------|------------------|-----------------
Apr 6  | DET @ ORL | 107 - 123   | DET              | 86.2%
Apr 5  | LAL @ DAL | 128 - 134   | LAL              | 86.5%
Apr 4  | SAS @ DEN | 134 - 136   | SAS              | 81.7%
Apr 3  | NOP @ SAC | 113 - 117   | NOP              | 83.7%
Apr 1  | SAC @ TOR | 123 - 115   | TOR              | 95.0%
Apr 1  | IND @ CHI | 145 - 126   | CHI              | 83.8%
Mar 29 | MIA @ IND | 118 - 135   | MIA              | 92.1%
Mar 28 | PHI @ CHA | 118 - 114   | CHA              | 90.1%
Mar 27 | DAL @ POR | 100 - 93    | POR              | 93.0%
Mar 19 | LAC @ NOP | 99 - 105    | LAC              | 80.7%

How accuracy is measured: A prediction is correct when the predicted winner matches the actual game winner. Confidence levels are based on the model's win probability: High (75%+), Medium (60-74%), and Low (<60%).
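The scoring rule described above can be sketched in a few lines. This is an illustrative implementation of the stated thresholds, not the model's actual code; the function names are assumptions:

```python
def confidence_tier(win_prob: float) -> str:
    """Map a model win probability to the dashboard's confidence bands."""
    if win_prob >= 0.75:
        return "High"
    if win_prob >= 0.60:
        return "Medium"
    return "Low"

def is_correct(predicted_winner: str, actual_winner: str) -> bool:
    """A prediction counts as correct when the predicted winner won."""
    return predicted_winner == actual_winner
```

For example, the Apr 6 upset above (DET predicted at 86.2%, ORL won) falls in the High tier and scores as incorrect.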
