Meet the researcher behind ClinAssist and follow the 24-month plan from proposal to open-source prototype.
Final-year Computer Science student at UNSW with hands-on experience in machine learning, explainability, and clinical AI. ClinAssist began as a research concept and grew into a mission — to build AI systems that clinicians can actually trust and use. I'm pursuing this as a Master's by Research, sitting at the intersection of rigorous ML and real-world healthcare impact.
"The hardest problem in clinical AI isn't accuracy — it's trust. A model that can't explain itself will never be used, no matter how good it is."
ClinAssist proposal developed. Research direction defined. Initial supervisor outreach underway.
Systematic review of clinical AI and XAI literature. Structured clinician interviews to identify highest-priority decision support needs. TGA regulatory scoping. IRB/ethics approval and dataset access confirmed (MIMIC-IV, PhysioNet).
XGBoost + SHAP pipeline on structured EHR data. Transformer-based NLP on clinical notes. CNN/ViT imaging pipeline. Unified risk-scoring interface prototype.
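The tabular arm of this phase can be sketched in a few lines. This is a minimal illustration only, on synthetic data with hypothetical feature names, using scikit-learn's GradientBoostingClassifier and permutation importance as lightweight stand-ins for XGBoost and SHAP; the real pipeline would use those libraries on MIMIC-IV features.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Synthetic stand-in for structured EHR features (hypothetical names)
feature_names = ["age", "heart_rate", "lactate", "creatinine"]
X = rng.normal(size=(n, 4))
# Toy outcome driven mostly by lactate and heart rate
y = ((0.8 * X[:, 2] + 0.5 * X[:, 1]
      + rng.normal(scale=0.5, size=n)) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)

# Per-feature importance: the kind of signal a SHAP summary plot surfaces,
# here approximated by permutation importance on held-out data
imp = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
for name, score in sorted(zip(feature_names, imp.importances_mean),
                          key=lambda t: -t[1]):
    print(f"{name}: {score:.3f}")
```

The design point is the same either way: the model emits a risk score, and an attribution method ranks which inputs drove it, which is what the clinician-facing interface would display.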
Human-factors evaluation with emergency department (ED) clinicians: does SHAP-, LIME-, or attention-based explainability improve decision accuracy and clinician trust over a no-explanation baseline?
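The primary comparison in that evaluation reduces to a difference in proportions (e.g. correct decisions with explanations vs. without). A minimal, self-contained sketch of the analysis, with entirely hypothetical counts:

```python
from math import sqrt, erf

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference in proportions,
    e.g. decision accuracy with explanations (a) vs. a
    no-explanation baseline (b)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 82/100 correct with explanations, 70/100 without
z, p = two_proportion_ztest(82, 100, 70, 100)
print(f"z = {z:.2f}, p = {p:.4f}")
```

The actual study would also measure trust on validated scales and account for clustering by clinician, but the accuracy endpoint follows this shape.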
MRes thesis submission. Two target peer-reviewed publications: one on model architecture, one on clinical validation methodology. Open-source ClinAssist prototype released for further research and clinical piloting. Target venues: JAMIA, npj Digital Medicine.
Whether you're a clinician, researcher, or health institution interested in collaborating — we'd love to hear from you.