A Brief Review of Informal STEM Education Evaluator Competency Frameworks
Program evaluation spread to nonprofits and museums largely in the 1960s and 70s. Though program evaluation in these spaces is still relatively new, much progress has been made in developing our practices, differentiating the field from similar social science investigations, and advocating for the importance of evidence-based practice.
Articulating a set of evaluation standards is part of this development. With practitioners entering the evaluation field from so many academic disciplines and backgrounds, it can be challenging to establish a common set of criteria for assessing a broad scope of evaluation practices.
In informal STEM education evaluation, resources have been developed that aid evaluators in assessing their practices and considering areas for potential development. Let’s explore a few of them.
The American Evaluation Association (AEA), a premier U.S.-based professional organization for evaluators, developed a set of principles to which they encourage evaluators to adhere. First adopted in 1994, these principles are intended to serve as an ethics guide for evaluators. The AEA engages with its members to review and update these principles every five years to ensure they reflect the realities of an ever-changing world. Last updated in 2018, these principles are:
Systematic Inquiry: Evaluators conduct data-based inquiries that are thorough, methodical, and contextually relevant.
Competence: Evaluators provide skilled professional services to stakeholders.
Integrity: Evaluators behave with honesty and transparency in order to ensure the integrity of the evaluation.
Respect for People: Evaluators honor the dignity, well-being, and self-worth of individuals and acknowledge the influence of culture within and across groups.
Common Good and Equity: Evaluators strive to contribute to the common good and advancement of an equitable and just society.
These principles provide a sturdy foundation on which evaluators can base their practice, regardless of their focus.
The Visitor Studies Association, a professional association serving informal learning organizations, developed a set of evaluator competencies for self-assessment in 2008. The competencies are intended to be used by professionals working in informal learning settings, including museums, zoos, nature centers, visitor centers, historic sites, and parks. The defined competencies are:
Knowledge of the principles and practices of the field of visitor studies: Professionals should be familiar with the history, terminology, developments, key publications, and major contributions of the visitor studies field.
Knowledge of the principles and practices of informal learning environments: Everyone who engages in visitor research and evaluation must understand the characteristics that define informal learning settings and how learning occurs and develops in those settings through their principles, practices, and processes.
Knowledge of and practices with social science research and evaluation methods and analysis: Professionals must understand and demonstrate the appropriate practice of social science research and evaluation methods and analysis.
Knowledge of business practices, project planning, and resource management: Professionals must possess the planning and resource-management skills needed to design, conduct, and report visitor studies and evaluation research.
Professional Commitment: Professionals should commit to the pursuit, dissemination, and critical assessment of theories, studies, activities, and approaches utilized in and relevant to visitor studies.
Though the competencies were developed specifically for visitor studies professionals, these standards can be adapted to various sister fields. For example, professional commitment is a valuable quality across many industries. Professionals can utilize this competency rubric to measure their knowledge and skills, set goals for future development, and identify learning opportunities.
The Association of Science and Technology Centers (ASTC), a non-profit organization that aims to increase understanding of and engagement with science and technology for all, produced Professional Competencies for the Informal STEM Learning Field in 2018. Developed via an NSF-funded research initiative, the competencies emerged from three observations about informal STEM learning:
Informal STEM learning work is important.
Informal STEM learning work is difficult.
Informal STEM learning work is unique.
Researchers then facilitated three workshops at science centers across the U.S. and conducted a national survey measuring the workshops' outcomes for informal STEM learning professionals. These results, alongside current professional literature, became the foundation for the competencies framework.
The competencies are listed under four domains:
Institutional Operation (e.g., understanding and contributing to the institution’s mission, vision, and goals; creating structure within the institution; managing policies and practices of the institution; and providing financial leadership for programs and products).
Institutional Impact (e.g., identifying and aligning work with the institution’s audience; advancing the aspirations of the institution; influencing and contributing to equitable practices within the institution; and contributing to evaluation and research within the informal STEM learning field).
Job-Specific Expertise (e.g., developing and/or managing programs; contributing to informal STEM learning practices that maximize the impact of resources; making decisions about work based on evidence of effectiveness and efficiency; and participating in professional learning opportunities).
General Expertise (e.g., creating and supporting practices that encourage intrapersonal reflection; building collaborative relationships; demonstrating effective communication skills; and utilizing creative and analytical thinking).
This short video from ASTC discusses the competencies and how they can be utilized specifically within the informal STEM education (ISE) space.
While not directly aimed at evaluators, this competency framework is useful in understanding what skills ISE professionals should prioritize and how those skills are relevant to their daily experience. For example, a key competency in the framework is “evidence-based practice,” housed under Job-Specific Expertise. This skill is directly tied to research and evaluation and to ISE professionals’ ability to make informed decisions in their practice.
Housed under Institutional Impact, Evaluation & Research is recognized as an important domain category and ties in with the General Expertise priority of creative and analytical thinking. These competencies flex problem-solving muscles and require professionals to grapple with multi-faceted, complex issues.
This framework is currently being updated by ASTC alongside colleagues at the COSI Center for Research and Evaluation and Oregon State University’s STEM Research Center.
Through 2025, the team will develop tools and resources for the informal STEM education and informal STEM learning fields. Their blog post about the initiative stated that their goal is to “better support users in growing their professional skills and reflect field-wide changes in the workforce.” ASTC is currently collecting perspectives from those in the ISE field that will contribute to developing their upcoming collection of resources. Those who want to contribute feedback to the work can submit their interest here.
Though the frameworks highlighted above take different approaches to evaluation practice, they all prioritize the following:
Competence: professionals should have a strong understanding of the history, present practices, and upcoming trends within their field. Pursuit of professional development and continuing education is important.
Systematic and Analytical Thinking: professionals should maintain a foundation in evidence-based practices and be able to apply evaluation methods to interpret and act on data in a solution-oriented manner.
Common-Good Approach: professionals should honor, value, and respect those they work with (both within and outside their organization), and practice in a way that advances equity, diversity, and access.
Informal STEM education and informal STEM education evaluation are hyper-specialized fields. However, they share many practices and values with other domains. We hope these tools are helpful for professionals looking to sharpen their skills or better understand the breadth of work in our spaces.
Are there other evaluation competency models that we missed? Let us know!
If you enjoyed this post, follow along with Improved Insights by signing up for our monthly newsletter. Subscribers get first access to our blog posts, as well as Improved Insights updates and our 60-Second Suggestions. Join us!