Recent Changes

Tuesday, April 19

  1. page Time Log edited
    4/18 | 1.5 | Completed | Dawn | Content of Evaluation page
    4/17 | 2.5 | Completed
    6:57 pm
  2. page Evaluation edited ... For this project, we have attempted to capture a measure of both types of evaluation through t…
    ...
    For this project, we have attempted to capture a measure of both types of evaluation through the use of a feedback survey and a sample assignment that requires using the skills presumably learned by completing the tutorial. Two participants provided links to screencasts that they had successfully created. However, both respondents also indicated that they had prior experience using Camtasia, so it is not possible to conclude that the tutorial played a role in the successful completion of their screencasts.
    Several of the survey questions address formative evaluation by asking for feedback on the design of the tutorial itself, specifically by prompting users to use the space provided "to tell us what you liked best about the tutorial or what aspects of it you found to be the most helpful" and to "tell us what you liked least or found the least helpful about the tutorial." Other questions approximate summative evaluation by asking users to self-rate, via a Likert-scale response, their ability to successfully complete a screencast as a result of having completed the tutorial, and they ask for this information as it specifically relates to the various aspects of screencasting addressed in the tutorial. An example of these questions asks users to select a response ranging from "I am very confident that I could make the necessary plans for an effective screencast" to "I am confident that I could NOT make the necessary plans for an effective screencast."
    ...
    cohort of twelve USC faculty
    Evaluation Survey Results:
    Familiarity with screencasting before the tutorial:
    6:46 am

Monday, April 18

  1. page Time Log edited
    Evaluation | 4/18 | 1.5 | Completed
    8:04 pm
  2. page Evaluation edited ... For this project, we have attempted to capture a measure of both types of evaluation through t…
    ...
    of Education, had completed the
    Evaluation Survey Results:
    Familiarity with screencasting before the tutorial:
    ...
    We plan to monitor the feedback collection process for any additional responses that may become available in the near future, as a larger number of responses may more clearly establish items of consistent concern to users. The information gathered so far already points to issues we need to address, chief among them clarifying the interface differences between the Mac and PC versions of Camtasia. Additionally, we will be looking at places in the videos where there is no narration to determine whether content that should be provided has been left out. Other issues, such as whether to combine the content into one video or maintain the separate resources as they presently stand, are likely best kept under consideration until additional feedback is obtained. We will also monitor the pattern of responses associated with the Editing video to determine whether users continue to frequently rate their confidence in successfully completing that phase with the lowest possible score. Another issue that could merit attention is the possibility of providing greater context for the videos through supplemental text on the main tutorial pages.
    Overall, at this stage of the Evaluation Phase, which followed a necessarily brief Implementation Phase and involved learners with a wide range of technical and computer skills, we are pleased with the results and plan to continue to obtain more formative and summative evaluation as more USC College of Education staff and faculty have time to complete the tutorial.
    Sources:
    Brown, A., & Green, T. D. (2006). The essentials of instructional design: Connecting fundamental principles with process and practice. Upper Saddle River, NJ: Pearson.

    8:03 pm
  3. page Evaluation edited ... Evaluation is the process of determining the effectiveness of the instruction. Summative evalu…
    ...
    Evaluation is the process of determining the effectiveness of the instruction. Summative evaluation is the review of the finished instructional product. Other types of evaluation take place in earlier stages of the Dick and Carey model.
    To elaborate on the terminology, the following are generally accepted definitions of evaluation and its sub-types.
    Evaluation:
    Evaluation, the process of determining a system's effectiveness, is typically split into three categories:
    formative evaluation: the evaluation of the instruction performed while the instruction is being formed
    ...
    is our primary intended means
    Evaluation Survey Results:
    Familiarity with screencasting before the tutorial:
    6:34 pm
  4. page Evaluation edited ... learner evaluation: determining the performance change of the learners due to the instruction …
    ...
    learner evaluation: determining the performance change of the learners due to the instruction implemented
    Formative Evaluation:
    ...
    system of revision. Formative Evaluation consists
    design review - checks whether the instruction that is designed meets the analysis
    expert review - a Subject-Matter Expert (SME) reviews the implementation to determine whether the content is accurate and consistent
    ...
    ongoing evaluation - continually examines the design with respect to possible change in content, audience, or size
    Summative Evaluation:
    ...
    changes have occurred. In short, have
    6:32 pm
  5. page Evaluation edited ... Summative Evaluation is carried out after an instructional design has been implemented. It tes…
    ...
    Summative Evaluation is carried out after an instructional design has been implemented. It tests the effectiveness of the design and seeks to determine if the desired changes have occurred. In short, have the goals and objectives been met?
    ...
    nature of several survey questions address
    For the purposes of completing this assignment, collection of feedback data was cut off at noon on April 18, 2011. As of that point, six individuals from the learner group, which is a pre-identified cohort of 12 USC faculty and staff members within the College of Education, have completed the feedback survey and two have submitted links to their completed screencasts, which is our intended means of summative evaluation. The details of their comments are provided below and available here as a pdf: {STARS+Evaluation+Survey+Results.pdf} .
    Evaluation Survey Results:
    6:31 pm
  6. page Evaluation edited {EvaluationLogo.jpg} The ultimate purpose The STARS team has, throughout the development …
    {EvaluationLogo.jpg} The ultimate purpose
    The STARS team has, throughout the development process, followed the approach to instructional design outlined by Dick & Carey. Iterative evaluation and revision throughout the design and development process is a hallmark of the Dick & Carey model; however, the model does conclude with a formal, specified "Evaluation" phase. Specifically, this is what the model states about this phase:
    Evaluation
    ...
    ...
    been met?
    ...
    a pdf: {STARS+Evaluation+Survey+Results.pdf} .
    Evaluation Survey Results:
    Familiarity with screencasting before the tutorial:
    ...
    1=Very confident they could complete the stage of development
    5=NOT confident they could complete the stage of development
    {Screen_shot_2011-04-17_at_3.05.25_PM.png}
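    A Likert summary like the one pictured above can be tabulated with a short script. This is a minimal sketch, assuming responses are coded 1-5 as in the legend; the response values in the example are illustrative, not the actual STARS survey data.

    ```python
    # Hypothetical sketch: summarizing Likert-scale survey responses,
    # where 1 = very confident and 5 = NOT confident (per the legend above).
    from collections import Counter

    def summarize_likert(responses):
        """Return the mean score and a count for each scale point 1-5."""
        counts = Counter(responses)
        mean = sum(responses) / len(responses)
        return mean, {point: counts.get(point, 0) for point in range(1, 6)}

    # Illustrative example: six respondents rating one stage of development.
    mean, distribution = summarize_likert([1, 2, 2, 3, 5, 5])
    print(f"mean={mean:.2f}, distribution={distribution}")
    ```

    Reporting both the mean and the full distribution matters here: a middling mean can hide a cluster of lowest-confidence (5) responses, which is exactly the pattern we are watching for on the Editing video.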
    The following is a list of the text responses provided to the open-ended questions within the survey:
    Additional questions about Camtasia Studio not addressed in the tutorial:
    6:29 pm
