For Universities

Accreditation-ready evidence.
Built into every assessment run.

Graduate outcome frameworks and professional accreditation bodies require evidence of authentic skill development. We generate that evidence automatically — at the granularity of individual students, individual criteria, and individual milestones.

ACS

Australian Computer Society

Aligned with the ACS Core Body of Knowledge

The ACS CBOK defines what ICT professionals need to know and do. Our platform operationalises the professional practice dimensions — making them assessable, measurable, and reportable at scale.

ICT Problem Solving

Requirements traceability, test design, and iterative delivery are assessed directly through issue management, milestone completion, and test coverage trends.

Collaborative & Communication Skills

Code review participation, PR description quality, and peer feedback engagement are scored per student — making collaboration a measurable outcome, not an assumption.

Professionalism

Commit discipline, branching practices, CI ownership, and deadline management provide an operational definition of professional conduct in software engineering.

Ethics & Academic Integrity

Automated cross-repo similarity detection via JPlag provides systematic, evidence-based integrity checking across the full cohort at every milestone.
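JPlag itself performs structural, language-aware comparison; as a loose illustration of the underlying idea of pairwise cross-cohort similarity scoring (not JPlag's actual algorithm), here is a minimal Jaccard similarity over token n-grams:

```python
# Illustrative only: Jaccard similarity over token 3-grams.
# JPlag's real comparison is structural and language-aware; this sketch
# just shows the flavour of pairwise cross-repo similarity scoring.
import re
from itertools import combinations

def shingles(source: str, n: int = 3) -> set[tuple[str, ...]]:
    """Return the set of n-token windows in a source string."""
    tokens = re.findall(r"\w+", source)
    return {tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity between the shingle sets of two submissions."""
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Hypothetical submissions keyed by student ID.
repos = {
    "s100": "def total(xs): return sum(x for x in xs)",
    "s200": "def total(xs): return sum(v for v in xs)",
    "s300": "print('hello world')",
}
for (id1, src1), (id2, src2) in combinations(repos.items(), 2):
    print(id1, id2, round(similarity(src1, src2), 2))
```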

EA

Engineers Australia

Mapped to the Stage 1 Competency Standard

Engineers Australia requires graduates to demonstrate engineering application ability and professional attributes. Our platform provides objective, per-student evidence against six Stage 1 competencies.

Stage 1 · 2.2

Engineering techniques, tools and resources

Students are assessed on fluent use of version control, CI/CD pipelines, code review workflows, and issue tracking — the toolchain of professional software engineering.

Stage 1 · 2.4

Systematic management of engineering projects

Milestone planning, task decomposition, and incremental delivery are scored per student, providing evidence of systematic project management practice.

Stage 1 · 3.1

Ethical conduct and professional accountability

Academic integrity detection and individual contribution tracking ensure that each student is accountable for their own demonstrated practice.

Stage 1 · 3.2

Effective oral and written communication

Commit message quality, PR descriptions, and code review comments are analysed for clarity, specificity, and professional register.

Stage 1 · 3.5

Orderly management of self and professional conduct

Commit distribution patterns, deadline behaviour, and consistent engagement over the semester provide objective evidence of self-management.

Stage 1 · 3.6

Effective team membership and leadership

Per-student review activity, responsiveness to feedback, and contribution equity within teams surface the individual dynamics that team grades conceal.
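One simple way to quantify the contribution equity mentioned above is a Gini coefficient over per-member commit counts (0 means perfectly even; values approaching 1 mean one member does nearly everything). This is a generic statistical sketch, not a description of our actual scoring model:

```python
# Illustrative sketch: Gini coefficient over per-member commit counts.
def gini(counts: list[int]) -> float:
    """0.0 = perfectly even contributions; -> 1.0 as one member dominates."""
    xs = sorted(counts)
    n, total = len(xs), sum(xs)
    if n == 0 or total == 0:
        return 0.0
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * total) - (n + 1) / n

print(gini([25, 25, 25, 25]))  # evenly contributing team
print(gini([97, 1, 1, 1]))     # one member dominates
```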

Graduate Outcome Frameworks

From intended outcomes to demonstrated evidence.

Threshold Learning Outcomes and graduate attribute frameworks require evidence that students have developed, not just been exposed to, professional competencies. We close the gap between curriculum intent and assessment reality.

“Our platform operationalises what good software engineering education has always aimed at: authentic assessment of professional practice, at scale, with evidence that satisfies both internal quality assurance and external accreditation requirements.”

Threshold Learning Outcomes (TLOs)

Software engineering TLOs require students to demonstrate application of professional methods and tools. Our rubric maps directly to these outcomes — every criterion is traceable to a TLO.

TEQSA Higher Education Standards

TEQSA requires evidence of student learning outcomes at the unit level. Our per-milestone, per-criterion, per-student data provides the granularity that annual reporting demands.

Institutional Graduate Attributes

Attributes like professional responsibility, teamwork, and lifelong learning are often assessed anecdotally. We make them measurable through git-level behavioural evidence.

Curriculum Mapping & Improvement

Aggregate criterion performance data reveals where the curriculum is working and where it is not — enabling evidence-driven curriculum review rather than intuition-based adjustment.

Quality Assurance

Evidence your QA committee will trust.

Every claim is backed by a timestamped, auditable record from the student's own repository.

Timestamped evidence trails

Every score is backed by a git-level evidence trail — commit hashes, PR timestamps, CI run IDs. Defensible, auditable, and ready for panel review.
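As a sketch of what such a trail could contain, a per-score evidence record might look something like the following. All field names and values are hypothetical and illustrative, not an actual schema:

```python
import json

# Hypothetical evidence record; field names and values are illustrative only.
record = {
    "student_id": "s1234567",
    "criterion": "EA Stage 1 - 3.2",
    "score": 4,
    "evidence": [
        {"kind": "commit", "sha": "a1b2c3d", "timestamp": "2025-04-02T10:15:00Z"},
        {"kind": "pull_request", "opened_at": "2025-04-03T09:00:00Z"},
        {"kind": "ci_run", "run_id": 8812, "conclusion": "success"},
    ],
}
print(json.dumps(record, indent=2))
```

Because every item carries its own timestamp and identifier, a panel can trace any score back to the underlying repository events.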

Cohort-level aggregate data

Criterion-level pass rates, grade distributions, and milestone trend data support annual unit reviews, accreditation self-studies, and curriculum improvement cycles.

Consistent standards across markers

Automated rubric application eliminates inter-marker variability. Every student in the cohort is assessed against the same criteria with the same rigour.

Exportable for accreditation reporting

Cohort performance data exports in structured formats suitable for submission to ACS, Engineers Australia, TEQSA, or internal quality assurance committees.
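A structured export of per-student, per-criterion scores could be as simple as a flat CSV. The columns below are an illustrative sketch, not a fixed export schema:

```python
import csv
import io

# Hypothetical cohort export; columns and values are illustrative only.
rows = [
    {"student_id": "s100", "milestone": "M1", "criterion": "ACS-collab", "score": 3},
    {"student_id": "s100", "milestone": "M1", "criterion": "ACS-prof", "score": 4},
]
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["student_id", "milestone", "criterion", "score"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```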

Ready to strengthen your accreditation case?

We will walk you through how our platform maps to your specific accreditation requirements and generate a sample evidence report from your own course data.

Book a 30-min Demo

Institutional and faculty licensing available.