Evaluation: Professional Experiences
Assistant Executive Director, e-Learning and
In every capacity I manage, a degree of evaluation is occurring with an eye to timeliness, quality, and budget. Formalized iterative review, built into the production process, is paramount for all professional development offerings and e-Learning resources we produce. A constant tension exists among time to market, production/delivery costs, and quality: if one increases quality or decreases time to market, costs typically go up, and the reverse holds as well. Evaluation is a process that helps maintain the balance among the three; it is important not to sacrifice the good for the sake of the perfect. To maintain this balance we conduct both formative evaluation (to improve the effectiveness of an offering during production) and summative evaluation (to determine the overall worth and impact of a resource after it is deployed).
I have been fortunate to serve on several grant review panels and advisory committees during my tenure at NSTA, which allows me to engage in executive review and administration of other programs outside of NSTA as well.
Led the initiative at NSTA to design and evaluate a customized NSTA LMS that integrates with a third-party LCMS solution, conducting a rigorous RFP evaluation process to create a sustainable and scalable e-Professional Development portal called The NSTA Learning Center. See the following presentations for the progression and refinement of the e-PD portal system, as drawn from the literature on LMS/LCMS and SCORM initiatives and from feedback gathered through numerous professional presentations, one-on-one expert reviews, and focus groups throughout its development:
Facilitated significant formative evaluation with both internal NSTA personnel and external constituents in the design of the portal, which included both formal and informal feedback loops at many stages of development.
Based on the current challenge of scaling PD on a national level, utilized research and SCORM learning objects to create the initial templates and production model for what are now known as Science Objects and SciPacks.
- Facilitated building and managing the production team at NSTA, and hosted a "Boot Camp" to recruit the initial consultant talent pool for the project.
- Instrumental in creating and evaluating a research-based production model to support our design efforts, one that incorporated rigorous formative evaluation and design based on best practices and research. See the following web site:
- Created initial formative review documents for these resources and administered several focus groups on early prototypes of SciPacks.
- Initiated, designed, and administered a three-district pilot with the first SciPack created, Force and Motion, which resulted in a publication in the Journal of Science Education and Technology.
Served as primary author, Co-PI, and project manager of a two-year, $847,000 National Science Foundation National Science Digital Library (NSDL) grant that created the back-end infrastructure to metatag all of NSTA's digital resources with IEEE LOM and extended Dublin Core tags that are now auto-harvestable via OAI servers and the NSDL. The challenge was how to metatag NSTA's digital assets and present them via a system that ensured content was searchable, findable, and ultimately sustainable on a national level through a web-based delivery portal. Partnered with Ohio State University and assembled an advisory panel of review experts, a third-party evaluator, a usability expert, and an accessibility evaluator to inform our efforts. In the PI role, served as the final quality-control reviewer throughout all stages of the grant period. See the following documents:
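To illustrate the kind of record this infrastructure exposes: an OAI-harvestable resource is typically described with a small Dublin Core XML record. The sketch below builds a minimal `oai_dc` record with Python's standard library; the field values are hypothetical placeholders for illustration, not actual NSTA metadata.

```python
import xml.etree.ElementTree as ET

# Namespace URIs defined by the OAI-PMH oai_dc format and Dublin Core.
OAI_DC = "http://www.openarchives.org/OAI/2.0/oai_dc/"
DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("oai_dc", OAI_DC)
ET.register_namespace("dc", DC)

def make_dc_record(title, creator, subject, identifier):
    """Build a minimal Dublin Core record of the kind an OAI server exposes."""
    record = ET.Element(f"{{{OAI_DC}}}dc")
    for tag, value in [("title", title), ("creator", creator),
                       ("subject", subject), ("identifier", identifier)]:
        el = ET.SubElement(record, f"{{{DC}}}{tag}")
        el.text = value
    return record

# Hypothetical example values -- not drawn from the actual NSDL project data.
rec = make_dc_record("Force and Motion SciPack", "NSTA",
                     "Physical Science", "urn:example:scipack-001")
xml = ET.tostring(rec, encoding="unicode")
print(xml)
```

A harvester speaking OAI-PMH would retrieve batches of such records and map them into its own index, which is what makes the tagged assets findable at a national level.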
I began my full-time career as Director of the NSTA Institute, which at the time was a web page where several universities could place a link to their online course offerings. While this was primarily a marketing partnership, we have since built the NSTA Learning Center with NSTA's own suite of e-PD offerings, including blended symposia, online short courses, interactive self-paced modules, and live web seminars, in addition to a robust suite of tools to assist teachers in planning and documenting their growth over time.
The first e-PD offerings in 2003 were NSTA symposia, which soon provided online follow-up via web seminars. I generated the initial user feedback surveys and began to capture this feedback both to document the impact of the program for its improvement and to support future funding opportunities. One may view the entire history of the program, including attendance and user feedback, in both the Symposia evaluations and the Web Seminar evaluations; the web seminars were initially funded via a peer-reviewed NASA grant after I planned and implemented a "History of Winter" live web seminar experience with NASA from Lake Placid, NY in February 2004. I later commissioned a third-party evaluation to analyze these data, as well as the post-conference follow-up for our 2008 national conference symposia. We found that a majority of the teachers found the experiences favorable and implemented the resources from the symposia in their classrooms. Similarly, a majority showed significant gains on pre/post assessments from the on-site component of the blended PD experience.
NSTA SciGuides Resource
I directed the last two years of a three-year, $1.3 million National Science Foundation grant called NSTA WebWatchers. The project goals were to train a cadre of educators and to design a coding system to help teachers better search, evaluate, and integrate use of the Internet in their teaching. The project and its processes created what is now known as NSTA SciGuides.
I was asked to direct the project two months before the second summer series of workshops. I reviewed the original grant proposal and the year-one evaluation report from Horizon Research to quickly gain an understanding of the goals, milestones, and current issues facing the project, as well as the current literature calling for effective integration of, and training in, the use of the Internet in the classroom. The final NSTA report to NSF provides our overview of impact.
See below for evaluation efforts I brought to the project:
- Migrated the project from using simple evaluation criteria for URLs to a series of eight valid and reliable three-level analytical rubrics, based on the literature at the time and on rubric evaluation work developed by another NASA grant project called Networking for Leadership, Inquiry, and Systemic Thinking. See publications and proceedings:
- Secured and managed a consultant team and process to conduct reliability and content-validity reviews, ensuring a more consistent and accurate evaluation of selected web sites. These rubrics are now incorporated into the NSTA SciLinks and NSTA SciGuide programs.
- Managed and outsourced usability studies to drive the layout and design of the SciGuide components via a drill-down hierarchy for accessing the vetted URLs, rubric ratings, icon tag filters, and national standards alignments, while ensuring the site also met Section 508 accessibility requirements. Through rigorous and iterative formative and summative evaluation processes, we created an award-winning product that is of value to educators integrating the Internet into their classrooms. View the original multimedia overview of the product: http://sciguides.nsta.org/demo/index.html. The attached PowerPoint also highlights several third-party evaluations conducted during the product's life cycle.
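Rubric reliability reviews of the kind described above are commonly quantified with an inter-rater agreement statistic such as Cohen's kappa, which corrects raw agreement for the agreement two raters would reach by chance. The sketch below computes it for two hypothetical reviewers scoring ten web sites on a three-level rubric; the scores are illustrative only, not data from the WebWatchers project.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b, levels=(1, 2, 3)):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(rater_a)
    # Proportion of items on which the two raters gave the same level.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal level frequencies.
    ca, cb = Counter(rater_a), Counter(rater_b)
    expected = sum((ca[lv] / n) * (cb[lv] / n) for lv in levels)
    return (observed - expected) / (1 - expected)

# Hypothetical three-level rubric scores from two reviewers.
a = [3, 2, 3, 1, 2, 3, 2, 1, 3, 2]
b = [3, 2, 3, 2, 2, 3, 2, 1, 3, 1]
print(cohens_kappa(a, b))
```

Values near 1 indicate strong agreement beyond chance; a validity review would pair such a statistic with expert judgment about whether the rubric levels measure what they claim to measure.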