Evaluation: Supporting Exhibits
The exhibits below document how I have managed and applied evaluation expertise in recent work.

I have also served on review and evaluation panels, which have allowed me to apply knowledge in the fields of science education, curriculum development, professional development, e-Learning, and instructional design. For those experiences, please see the section of my portfolio titled: Advisory and Grant Review Panels.




 
End-User Review Templates
Formative Evaluation Feedback for SciPacks
Created end-user review templates for the original SciPacks and Science Objects. The document shows review questions for both first- and second-round reviews, administered early in the program via web and email. The questions asked of end users target feedback in areas such as sequencing, pacing, and motivational potential. I have instilled in our team the conviction that end-user formative review is a critical feedback loop for every e-PD resource we create within the NSTA Learning Center.






 
Expert Review Template
Formative Evaluation Feedback for SciPacks
Created the expert review template for the original SciPack and Science Object production process. The document shows the review questions critical for standards-based content development. After a Design Scope Document is created to "unpack" and identify the science content standards addressed in an NSTA SciPack, and a content author applies the ID templates, content-expert reviews are a critical part of the production process, ensuring content accuracy, currency, breadth, and depth, as discussed in many formative evaluation ID development guidelines.




Learning Center e-PD Focus Groups, Baltimore, MD
Problem Analysis
In November of 2006, NSTA conducted its second series of focus groups examining the NSTA Learning Center and its e-PD resources. I helped design the parameters of the focus groups and the questions to be asked, given the state of the portal at that time, and served as one of the silent observers/recorders of the event. The Learning Center was then in beta release, with about 800 metatagged items and only version 1.0 of "My Library" and "My Notepad" completed. With both administrators and teachers, we confirmed the value of SciPacks and SciGuides and the benefit of a "virtual bookshelf" for accessing digital resources linked to "certification" from a final assessment. Price points were also explored and later adjusted for market value.



Likert Survey and SciPack Focus Group, West Virginia Department of Education
Criterion-Referenced Measurement
In October of 2006, I helped design a teacher feedback Likert-scale survey that was administered by the State Science Assessment Specialist for the West Virginia Department of Education, in conjunction with a face-to-face focus group for teachers who had completed the Force and Motion and Gravity and Orbits SciPacks as part of a 50-teacher pilot study. The goal was to gauge teacher perceptions of learning science content knowledge via NSTA SciPacks. Both quantitative and qualitative feedback summaries are provided in this brief evaluation report. The data provided critical feedback for the SciPack development process. West Virginia is now in its third year using SciPacks as part of its Math/Science Partnership PD plan for middle-level and special education teachers.




Three District SciPack Pilot: Charlotte-Mecklenburg, Omaha, and Everett Public School Districts
Criterion-Referenced Measurement
In the spring of 2006, I designed, implemented, and managed a pilot study of our first completed SciPack, on Force and Motion. It employed a quasi-experimental design with non-randomized intact groups of teachers from three separate districts. We collected data from numerous sources, such as teacher feedback surveys, administrator interviews, teacher usage logs, a final assessment at the end of the SciPack, and a pre/post third-party assessment instrument administered by Horizon Research, Inc. I generated and secured IRB consent from all participants in cooperation with district administrators. With the help of my executive producer, Steve Rapp, we received valuable feedback to guide future development. The data were later analyzed by Dr. Greg Sherman and published in the Journal of Science Education and Technology, online in October 2007 and in print in February 2008.
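The pre/post comparison at the heart of this design reduces to a paired analysis of each teacher's gain score. As a rough illustration only (the scores below are invented for the sketch, not the pilot's actual data), a paired t statistic can be computed with the Python standard library:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t statistic for pre/post scores from the same participants."""
    diffs = [b - a for a, b in zip(pre, post)]  # individual gain scores
    n = len(diffs)
    # t = mean gain / standard error of the gains; df = n - 1
    return mean(diffs) / (stdev(diffs) / sqrt(n)), n - 1

# Hypothetical pre/post content-assessment scores (percent correct) for eight teachers
pre  = [52, 48, 61, 55, 44, 58, 50, 47]
post = [68, 59, 70, 66, 55, 71, 58, 60]
t, df = paired_t(pre, post)
print(f"mean gain = {mean(post) - mean(pre):.1f} points, t({df}) = {t:.2f}")
```

With SciPy available, `scipy.stats.ttest_rel(post, pre)` would additionally return the p-value; the sketch above deliberately uses only the standard library.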






 
SciPack Focus Group and Usability Study
Formative Evaluation Feedback for SciPacks
In March of 2005, I managed several focus groups to guide the development of critical features, establish price points, and confirm the overall worth of the product we were developing. We also sought specific end-user feedback on the navigability of the GUI for our SciPacks and Science Objects. I secured the services of a third-party web firm to assist my staff and me in this endeavor. We recruited teachers from surrounding school districts who matched our target population and invited them to NSTA Headquarters, typically beginning in the late afternoon or on a Saturday. The results of one of these focus groups are included here. It is absolutely critical to gather systematic feedback from the target audience you seek to design for. These focus groups were conducted after extensive early research documenting the need for the product we were designing, which included market research analysis, a review of the competitive landscape, and calls from the literature.





 
SciPack Focus Group: University of Wisconsin Partnership
Formative Review
In the summer of 2005, I conducted a one-day focus group with 16 teachers to gather formative usability feedback on SciPacks, including the degree of interactivity, desired features, and robustness of the platform. The focus group yielded written feedback, observations as teachers worked through the SciPacks, small-group discussion, and informal expert review feedback from the PIs conducting the face-to-face professional development in Wisconsin.




SciGuide Review, NSTA PD and Middle Level Committees
Summative Review
In 2005, NSTA desired a summative pedagogical expert review of SciGuides by members of our professional development and middle-level committees. Members are appointed to these committees by the current NSTA President and are leaders in their field, comprising a cross-section of K-12 master educators, National Board Certified Teachers, science education professional consultants, and science education university professors. A panel of nine responded to a series of questions enclosed herein. I led this process and crafted the questions that the senior staff at NSTA agreed upon for the review. We used this data to guide our decision to continue developing SciGuides, given their perceived worth to these qualified national committee members.



WebWatcher SciGuide PD Evaluation Report: NASA Explorer School Teacher Workshop
Criterion-Referenced Measurement
In 2004, I led a team of NSTA consultants in delivering a two-day PD workshop across different NASA centers as part of the overall face-to-face professional development experience for NASA Explorer School educators. We utilized exemplary educators who had been group leaders in the prior NSF WebWatcher grant experience. The goal was to share the SciGuides and lessons learned with other educator groups. This was to be accomplished by creating a personal SciGuide using our WYSIWYG web-based content management system and applying search techniques and rubrics to evaluate selected NASA websites.

I learned that taking a 2-day face-to-face training model and migrating it primarily online with a volunteer community is challenging. Also, access to the NASA scientists became a problem, so the model was not as successful as originally hoped. We learned valuable lessons and selected the SciGuides with the most promise to develop for public consumption with our NSTA staff and production process.



A Review of the PD Landscape
Problem Analysis
This PowerPoint presents an analysis of the PD landscape and supports the need for the NSTA Learning Center. It addresses what the literature is calling for and incorporates the "gaps" identified in a recent report by Alan Ginsburg of the US Department of Education, based on a survey of online professional development in mathematics. The quoted citations demonstrate how the Learning Center and its e-PD resources and tools apply knowledge of current trends and stated needs in professional development for science education.



Learning Communities Web 2.0 Presentation
Problem Analysis
This presentation looks at current trends in e-consumption via the web and the viral features that help "tip" usage. I stay informed through books like Wikinomics, The Tipping Point, and The Long Tail, as well as by reviewing competitors' e-Learning portals. I use this knowledge to make recommendations at the senior level of the organization and translate them into action as part of the collective discussion.

Example:
In our new iTunes-style e-commerce B2C model for the Learning Center, we utilized the NSF metatagging grant to allow our resources to be discovered within the NSDL and to build the back-end infrastructure that tags all our resources for individual sale. Previously, an individual had to be an NSTA member to gain access to our 3,000 articles across four journals. Within the Learning Center, we keep about 25% of the quality content free, and we have recently seen back-to-back record quarterly growth in sales since article prices were dropped from $4.99 to $0.99 per article. For an "e-Learning" Web 2.0 world, it is about the 4 C's: Converge, Communicate, Collaborate or Co-Create, and Captivate.




Al Byers NSTA e-Strategy
Long Range Planning
Taking into account current trends in Web 2.0 technologies, I refined the "4 C's" e-strategy with specific recommendations we might pursue within NSTA:

  • Convergence: Assimilate our assets/resources into a single point of access
  • Communication: Open up our member-only listserv to those beyond membership; use members to proselytize our assets to non-members
  • Collaboration or Co-Creation: Learners want more control over how they work with content. Localization and contextualization of the content are good, but be wary of quality control with respect to content accuracy in "mashable" environments, e.g., Wikipedia
  • Captivation: The expected outcome to monitor as one achieves the first 3 C's







 

NSTA Strategic Planning Documents
Long Range Planning
The earliest efforts at developing a strategic vision by the governance of NSTA occurred in 2002 with a PD Task Force Report. The major implication for NSTA was to move beyond isolated conference experiences and deliver offerings directly to school districts. This charge has been fulfilled with the NSTA Learning Center and targeted marketing of our publications to school districts. Led by Frank Owens, Associate Executive Director of Professional Programs, we analyzed our existing suite of PD resources and offerings and classified them in a matrix by objectives and customers, in an attempt to qualitatively assess potential growth over a five-year horizon. From this work, I drafted a strategic analysis and plan for my department, with articulated goals to create an e-PD system. Then, in 2005, NSTA published "Strategy 2005," which articulated our strategic goals and objectives for the next five years. The NSTA Learning Center and its resources were aligned to these objectives, and progress was tracked twice a year in an "Executive Update" from our Executive Director to the Board of Directors. I have included a sample related to my area from June 2006.

The documents from top to bottom:

  1. PD Task Force Report (Word)
  2. Scale and Sustainability Program Inventory (Word)
  3. Government Partnerships and e-Learning Strategic Plan Document (Word)
  4. Strategy 2005 Goals and Objectives (PDF)
  5. Executive Office Report June 2006 for e-Learning (Word)



National Science Foundation Expert Panel Review: Anticipating the Role of Emerging Technologies in Science Education
Long Range Planning
In 2002, I was asked by Mike Haney at NSF to serve on a review panel related to technology and science education. This was my first expert panel review and a tremendously worthwhile experience. I learned an incredible amount about the then-current advancements in science and technology, as well as the rigor and protocols used to make critical evaluation decisions, and I made invaluable connections with the other reviewers. The attached file shows three of the dozen or so reviews I contributed to this effort. I have recommended and coached a similar review process for several other projects within NSTA, most recently NASA's Explorer School program in 2008, which desired an expert panel to review and comment upon its program and recent NRC review. These review panels inform our strategic plan, and as such, I make a concerted effort to participate in many advisory and expert review panels.



SciPack Third Party Evaluation Report: West Virginia Department of Education
Summative Evaluation
In May of 2007, a third-party evaluator was commissioned to analyze data from a pilot study conducted by the West Virginia Department of Education that utilized NSTA SciPacks to help teachers increase their science content knowledge. The pilot included a component utilizing a student assessment item bank, with teachers expected to augment their instruction based on the knowledge gained by completing two SciPacks. The results were very encouraging: teachers demonstrated significant gains in content knowledge, increased efficacy in their teaching, and, in certain cases, increased student learning. The challenge was in executing the plan as originally designed; maintaining the fidelity of a rigorous study in public education classrooms is a significant undertaking that requires full-time dedicated resources, incentives, monitoring, and facilitation. The evaluation results confirmed the impact of SciPacks, and WV continues to support this offering for approximately 100-200 teachers per year.


Webwatcher Third Party Rubric Evaluation
Formative Analytical Rubric Analysis
I managed and secured an evaluation of the SciGuide rubric reliability study after reviewing the research and working with educators to develop and draft the rubrics. The report was prepared by Dr. Maurya Schweizer of Virginia Tech. Through iterative improvement, inter-rater reliability coefficients were increased to a degree satisfactory for their intended purpose: to differentiate and guide teachers in the selection and application of web-based resources that emphasize characteristics desirable for classroom instruction, e.g., inquiry, interactivity, and communication/collaboration.
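For readers unfamiliar with inter-rater reliability, Cohen's kappa is one common coefficient for two raters scoring the same items: it corrects observed agreement for the agreement expected by chance from each rater's marginal category frequencies. A minimal sketch, using invented ratings rather than any data from the Schweizer report:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items on a nominal scale."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal category frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two hypothetical raters scoring ten websites on a 3-point rubric
a = [3, 2, 3, 1, 2, 2, 3, 1, 2, 3]
b = [3, 2, 2, 1, 2, 3, 3, 1, 2, 3]
print(round(cohens_kappa(a, b), 2))  # prints 0.69
```

Values near 1.0 indicate near-perfect agreement beyond chance; iterative rubric refinement, as described above, typically raises kappa by sharpening category definitions so raters disagree less often.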