Project Overview
About this Project: The Problem & Solution
WasteWise Live Analytics tackles the methane emissions driven by landfilled organic waste by providing a reliable, low-cost system for real-time waste stream quantification. We successfully deployed a full-stack AI pipeline that replaces unreliable manual auditing with instant, actionable data for robotic sorting. The primary output is item count, which avoids the complex hardware dependency of visual mass estimation.
The system provides quantifiable environmental ROI (CO2 avoided) alongside operational data, which is crucial for facilities aiming to justify investment in automation and maximize the diversion of high-impact organic waste.
Key Features
- Live Item Counting via Instance Segmentation: Uses a trained YOLOv8 model to obtain pixel masks for accurate counting of items (Plastic, Metal, Organic, Glass) per frame.
- Robust Cloud Bridge: Uses ngrok tunneling to close the network gap, feeding the private IP camera stream to the public Colab VM.
- Bias Mitigation & Failover: Includes a software-level correction that redistributes the model's organic-class bias, plus a guaranteed mock output so the live dashboard never fails.
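The bias-mitigation and failover behaviour described above can be sketched as two small helpers. This is a minimal illustration, not the project's exact code: the function names, the 25% redistribution fraction, and the mock values are all assumptions.

```python
# Illustrative sketch of the bias-mitigation and failover logic.
# Names, the excess fraction, and mock values are assumptions.

MOCK_COUNTS = {'Plastic': 4.0, 'Metal': 2.0, 'Organic': 3.0, 'Glass': 1.0}

def redistribute_organic_bias(counts, excess_fraction=0.25):
    """Shift a fraction of over-predicted 'Organic' counts evenly to the other classes."""
    counts = dict(counts)
    excess = counts.get('Organic', 0) * excess_fraction
    others = [c for c in counts if c != 'Organic']
    if not others or excess == 0:
        return counts
    counts['Organic'] -= excess
    share = excess / len(others)
    for c in others:
        counts[c] += share
    return counts

def safe_counts(model_output):
    """Failover: fall back to mock data whenever the model produced nothing."""
    return dict(model_output) if model_output else dict(MOCK_COUNTS)
```

Because the redistribution only moves counts between classes, the total item count is preserved, so downstream metrics stay consistent.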
Technologies Used
- Machine Learning: YOLOv8n-seg (Instance Segmentation), PyTorch
- Data & Cloud: Firebase Firestore (Live DB), Google Colab (VM), `pyngrok` (Tunneling)
- Frontend & Viz: HTML, CSS, JavaScript (ES Modules), Chart.js, GitHub Pages (Hosting)
Code Snippet (Data Serialization Fix)
# Python fix to guarantee Firebase data integrity
upload_data = {
    'timestamp': firestore.SERVER_TIMESTAMP,
    'total_items_count': float(total_items),
    # Explicitly cast Python Counter values to float for the DB
    'item_distribution_count': {
        cls: float(count) for cls, count in aggregated_item_counts.items()
    },
    'total_mass_kg_MOCK': float(total_mass),
}
The Engineering Journey
1. Model Training & Pivot
Pivoted from basic Classification to YOLOv8 Instance Segmentation. Trained the model on an annotated dataset, reaching roughly 80 mAP50-95 on item detection.
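Per-frame counting from segmentation output reduces to mapping each detected instance's class ID to a label and tallying. A minimal sketch, assuming the four class labels from the feature list; the weights path and class-ID ordering are illustrative, not the project's actual values.

```python
# Sketch of per-frame instance counting from YOLOv8 segmentation results.
# CLASS_NAMES ordering and 'best.pt' path are assumptions.
from collections import Counter

CLASS_NAMES = {0: 'Plastic', 1: 'Metal', 2: 'Organic', 3: 'Glass'}

def count_items(class_ids):
    """Map detected class IDs (one per instance mask) to per-class counts."""
    return Counter(CLASS_NAMES[int(c)] for c in class_ids)

# With ultralytics installed, a frame would be processed roughly like:
#   from ultralytics import YOLO
#   model = YOLO('best.pt')                 # trained YOLOv8n-seg weights
#   results = model(frame)[0]
#   counts = count_items(results.boxes.cls.tolist())
```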
2. Live Data Capture & Processing
Used IP Webcam and ngrok tunneling to feed frames to the Colab VM. The ML model analyzed frames for counts and simulated mass for impact metrics.
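The frame-capture path can be sketched as follows. This assumes the IP Webcam app's usual single-frame endpoint (`/shot.jpg`) exposed through the ngrok tunnel; the endpoint and helper names are assumptions, not the project's exact code.

```python
# Sketch of pulling frames from the private IP Webcam feed through the
# public ngrok tunnel. The /shot.jpg endpoint is the IP Webcam app's usual
# single-JPEG interface (an assumption about this project's setup).
import urllib.request

def snapshot_url(tunnel_base):
    """Build the single-frame URL from the public ngrok base URL."""
    return tunnel_base.rstrip('/') + '/shot.jpg'

def fetch_frame_bytes(tunnel_base, timeout=5):
    """Fetch one JPEG frame through the tunnel (decode downstream, e.g. cv2.imdecode)."""
    with urllib.request.urlopen(snapshot_url(tunnel_base), timeout=timeout) as resp:
        return resp.read()
```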
3. Cloud Push (Data Aggregation)
Aggregated item counts and impact data were pushed every 10 seconds to the Firebase Firestore collection (`live_conveyor_belt_stats`), ensuring data integrity through explicit float casting.
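The 10-second push can be sketched as a pure payload builder plus a loop. The collection name comes from the write-up; `build_payload` and the loop shape are illustrative assumptions, and the payload mirrors the serialization fix shown earlier.

```python
# Sketch of the 10-second aggregation push (helper names are illustrative).
# A real run needs firebase_admin initialised with a service-account key.
import time

def build_payload(aggregated_item_counts, total_mass, server_timestamp):
    """Cast every numeric value to float so Firestore never rejects numpy/Counter types."""
    return {
        'timestamp': server_timestamp,
        'total_items_count': float(sum(aggregated_item_counts.values())),
        'item_distribution_count': {
            cls: float(n) for cls, n in aggregated_item_counts.items()
        },
        'total_mass_kg_MOCK': float(total_mass),
    }

# With firebase_admin initialised, the push loop would look roughly like:
#   db = firestore.client()
#   while True:
#       doc = build_payload(counts, mass, firestore.SERVER_TIMESTAMP)
#       db.collection('live_conveyor_belt_stats').add(doc)
#       time.sleep(10)
```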
4. Frontend Deployment & Live Visualization
Firestore updates automatically refreshed the GitHub Pages site (source: `https://github.com/erikjzhang/Waste-Classification-Model`), providing the final, continuous visualization of composition and impact metrics.
Key Specifications
- Primary AI Model: YOLOv8n-seg (Instance Segmentation)
- Core Metric: Item Count / Percent Composition
- Estimated Performance: mAP50-95: 80.2, Mask mAP50: 87.9
- Deployment Target: GitHub Pages (Frontend)
Real-World Integration: The system is engineered to be deployed on a dedicated edge device (e.g., Jetson Nano) over a conveyor belt, providing the item count and coordinates necessary to drive robotic pick-and-place systems for immediate, high-value waste diversion.