
CompliFox
About the project
I built this during the Cambridge University EdTech Hackathon. The brief we focused on was: AI is being rolled out in education, but every region has different, constantly changing rules. Schools, regulators, and EdTech vendors all struggle to tell whether an AI feature is actually compliant. We reframed this as: "How can organisations make AI compliant across regions when every role is dealing with unclear, conflicting regulations?" That question inspired CompliFox: an AI-powered compliance co-pilot for education AI tools.

You paste lesson content, student data flows, product copy, or model prompts into a scanner. Behind the scenes, a dual-track engine evaluates the input against legal frameworks (GDPR / UK GDPR, PIPL, COPPA/CCPA, etc.) and ethical principles (transparency, fairness, safety, human-centricity, trust, sustainability, etc.). The app returns a clear risk view: a legal and an ethical risk level, the articles or principles that are triggered, and concrete "next step" recommendations (e.g. add parental consent, reduce data retention, improve transparency copy). A simple dashboard and history view lets compliance officers and school leaders compare scans over time and export evidence.

So far this is a hackathon prototype rather than a commercial product, but the reaction at the Cambridge event was encouraging: mentors immediately started talking about using it for pre-launch checks of AI features and for vendor due diligence in schools.

The biggest "wow" moment for me while building on Floot was how fast I could move from an idea to a working, branded product. Using Vibe Coding, I went from a blank canvas to a full landing page, a multi-step scan flow, an 8-dimensional risk radar, and a history/dashboard experience in a single weekend, plus a custom domain hooked up at compliguard.io. For a hackathon project, that speed and polish would have been impossible for me to achieve on my own without Floot.
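To make the dual-track idea concrete, here is a minimal sketch of what a scan result could look like as a data model. All names (`RiskLevel`, `Finding`, `ScanResult`) and the "overall track risk = worst finding in that track" aggregation rule are my own assumptions for illustration, not CompliFox's actual implementation:

```python
from dataclasses import dataclass
from enum import Enum
from typing import List


class RiskLevel(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"


@dataclass
class Finding:
    """One triggered article or principle, with a concrete next step."""
    reference: str      # e.g. "UK GDPR Art. 8" or "Transparency"
    level: RiskLevel
    next_step: str      # e.g. "Add a parental consent flow"


@dataclass
class ScanResult:
    """Dual-track result: legal findings and ethical findings side by side."""
    legal: List[Finding]
    ethical: List[Finding]

    def overall(self, track: str) -> RiskLevel:
        # Assumed aggregation rule: a track's overall risk is its worst finding.
        order = [RiskLevel.LOW, RiskLevel.MEDIUM, RiskLevel.HIGH]
        findings = self.legal if track == "legal" else self.ethical
        if not findings:
            return RiskLevel.LOW
        return max((f.level for f in findings), key=order.index)


result = ScanResult(
    legal=[Finding("UK GDPR Art. 8", RiskLevel.HIGH,
                   "Add a parental consent flow")],
    ethical=[Finding("Transparency", RiskLevel.MEDIUM,
                     "Improve the disclosure copy shown to students")],
)
```

A structure like this would also feed the history view naturally: each saved scan is just a timestamped `ScanResult` that can be compared against later ones.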