The evaluation service uses a proprietary, systems-based approach to evaluate the implementation and impact of a course, program, project, or initiative. Evaluations often focus on measuring the impact of a single factor or a small set of factors, but programs and projects are rarely (if ever) that simple. Consequently, evaluation results tend to fall short of clients’ goals for demonstrating impact.

Instead, we specialize in a systems-based approach to STEM and education programs. Our approach layers 17+ years of domain expertise with systems engineering methods to effectively evaluate the constellation of factors that influence outcomes. We work closely with the client to shape the evaluation study and bring clarity and coherence to the program design.

Evaluation activities focus on:

  • Systems-based evaluation framework to underpin evaluation design

  • Facilitated logic model and theory of action design discussions

  • Collaborative evaluation planning

  • Qualitative and mixed-methods approaches

  • Integrative evaluation designs that complement program implementation

  • Discipline-specific tools and assessments

  • Fidelity of implementation studies

  • Formative evaluation (continuous improvement)

  • Summative evaluation (impact and assessment)

Evaluation engagements look like:

  • Establishing the evaluator-client agreement as early as possible in the RFP cycle and/or program design. We do this through:

    • A conversation to establish fit between the evaluator’s expertise and the client’s needs, and to review the opportunity

    • A clarifying conversation (in the same conversation as above, or in a follow-up) about the evaluator’s role on the project/program team

    • Negotiation of terms and contract

  • Once contracted, evaluation study and publication activities include:

    • A series of program-specific discussions to clarify program or intervention design and establish evaluation study design

    • Implementation of the program evaluation, including: creating the evaluation plan, data collection, analysis, continuous-improvement feedback, reporting of assessment results, and publications

Timeframe: 1-5 years, depending on scope and objectives