Overview

We created our Program Assessment product to help institutions measure the effectiveness of their learning programs.  This builds upon our Course-Level Assessment product by allowing programs to coordinate and execute a second set of reviews outside of the course using a different rubric and different reviewers.

Before you get started

You might find it helpful to configure some of the following items within the Assessment Library before diving into Program Assessment.  In the process of setting up your assessments, you’ll need to reference programs, rubrics and learning outcomes.  

Create an assessment

Start by navigating to the program assessment zone within the EDU Platform.  There you’ll see a list of the existing assessments within your institution.  Click on the New Program Assessment button to get started.

You’ll be presented with a form to define your assessment.  After naming it and providing a description, select the program that you’re measuring.  You can also configure the assessment for anonymous reviews, which will hide the student’s name from the reviewers.  

The next step is defining the review strategy for the assessment.  This controls the relationship between the artifacts and the reviews that are to be performed.  

There are four review strategies you can use for your assessment - each of which has different characteristics.  Here is a short summary of each option:

Single Review, Unassigned:  In this strategy, a single review will be performed for each artifact and the reviews are not pre-assigned.  You can think of this as a first come, first served approach.

Single Review, Pre-Assigned:  In this strategy, a single review will be performed for each artifact and the reviews are pre-assigned.  This allows you to divide up 100 reviews over 10 reviewers, who will then perform 10 reviews each.

Multiple Reviews, Comprehensive:  In this strategy, every reviewer in the assessment can provide a review for each artifact in each collection (everyone reviews every artifact).  

Multiple Reviews, Pre-Assigned:  In this strategy, the coordinator sets the number of reviews that are required for each artifact and then the total number of reviews to be completed is divided up evenly amongst the reviewers.  If there are 100 artifacts, 2 reviews per artifact and 5 reviewers, the 200 reviews will get divided evenly across the reviewers, who would each end up with 40 reviews to complete.

If you’re not sure which review strategy works best for you - don’t worry.  You can change the review strategy later without losing data.  Finally, click ‘Create Program Assessment’ to create your new assessment.
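To make the arithmetic behind the pre-assigned strategies concrete, here is a sketch in Python.  This is purely illustrative - not the platform's actual code - showing how dealing review tasks round-robin produces an even split (100 artifacts at 2 reviews each across 5 reviewers yields 40 reviews apiece, as above):

```python
import itertools

def assign_reviews(artifact_ids, reviewer_ids, reviews_per_artifact=1):
    """Deal review tasks round-robin so each reviewer gets an even share.

    Assumes len(reviewer_ids) >= reviews_per_artifact, mirroring the
    platform's minimum-reviewer requirement for multiple-review strategies.
    """
    assignments = {reviewer: [] for reviewer in reviewer_ids}
    reviewer_cycle = itertools.cycle(reviewer_ids)
    for artifact in artifact_ids:
        for _ in range(reviews_per_artifact):
            assignments[next(reviewer_cycle)].append(artifact)
    return assignments

# 100 artifacts x 2 reviews each, split across 5 reviewers = 40 reviews apiece
tasks = assign_reviews(range(100), ["r1", "r2", "r3", "r4", "r5"],
                       reviews_per_artifact=2)
```

Because the deal is round-robin and there are at least as many reviewers as reviews per artifact, no reviewer is handed the same artifact twice.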

Create a collection

You’ll now see the program assessment that you have created.  The first thing you’ll want to do is create a new collection of artifacts to review within your assessment.  Click on the ‘+ Collection’ button to continue.

Collections are containers of work samples that should be evaluated using the same rubric.  When you create a collection, you’ll be asked to give it a name and a description.  You’ll also need to select the rubric that will be used to evaluate the artifacts within this collection.

Your first collection is now created and ready to use.  We’ll come back to building out the collection artifacts after we configure the team & roles for your assessment.  To get started with that, click on the ‘Team & Roles’ tab.

Define team and roles

When you created the assessment, the system added you as a coordinator.  Our model has two different types of roles:  coordinators and reviewers.  Coordinators are able to manage and change the configuration of the assessment.  Reviewers are able to review and score artifacts.  

To add team members, click on the ‘Add Role’ button.  This will present you with a form to search for and select team members to add to your assessment.  With each team member you add, you can elect to make the team member a Coordinator, Reviewer or Both. 

Note:  You will need to have a minimum number of reviewers equal to the number of reviews per artifact when using a multiple reviewer review strategy.

In this example, we have added two reviewers alongside our one coordinator.

Importing artifacts

Now that we have our team structured, let’s return to our collection and start adding artifacts.  Click on the ‘Collections’ tab and then click into our first collection.

We can start the process of adding artifacts into the collection by clicking on the ‘Import Artifacts’ button.  This will present us with several options:

Import from course assignment:  

This option will allow you to import student work samples from prior course work completed on Portfolium.  If your school has been using Portfolium for Course Assessment, you can easily migrate assignment submissions into your collection without having to export and import files or assets.  

Note:  This option will maintain an association between each imported artifact and the student and the course in which it was created.  This provides additional capabilities for reporting on the assessment results.

Upload a file:

This option will allow you to import individual files or groups of files as artifacts for your collection.  For groups of files, you can upload a zip file (compressed archive) containing files and folders.  All of the files within the archive (regardless of the nesting within folders or subfolders) will be imported.

Note:  This option will generally not maintain an association between the imported artifacts and the students and courses in which they were created. 

Import Blackboard submissions:

This option will allow you to import a set of files from a Blackboard export file.  The system will use the information embedded in the export file to correlate student IDs from Blackboard with student records in Portfolium.  It will also combine multiple files from the same student into a single artifact within the collection.

Note:  This option will maintain an association between each imported artifact and the student and the course in which it was created.  This provides additional capabilities for reporting on the assessment results.

Import Canvas submissions:

This option will allow you to import a set of files from a Canvas export file.  The system will use the information embedded in the export file to correlate student IDs from Canvas with student records in Portfolium.  It will also combine multiple files from the same student into a single artifact within the collection.

Note:  This option will maintain an association between each imported artifact and the student and the course in which it was created.  This provides additional capabilities for reporting on the assessment results.

Import from Canvas:

This option will allow you to connect to your Canvas instance to select courses and assignments to import into the collection.  You’ll be asked to authenticate with your Canvas instance and then be able to use our interface to search for courses and then assignments to import. 

Note:  This option will maintain an association between each imported artifact and the student and the course in which it was created.  This provides additional capabilities for reporting on the assessment results.

For this example, we’ll move forward with importing from a prior course assignment.  After selecting that option, you’ll be prompted to select the course and assignment to import.  You can also choose to link that assignment to the collection such that any future submissions to that assignment are added to the collection.

Artifacts added from course assignments will get added to your collection immediately.  Artifacts added through the other options are processed separately and will generally be added to the collection within a minute.  

After importing the artifacts, your collection will look something like this.  The next step is to create a sample set for your collection.  The purpose of the sample set is to allow you to assess a subset of the total artifacts that were imported into your collection.  

When you click on ‘Create Sample Set’, you’ll see three options for determining the size of your sample set. 


Suggested method:

This method uses a statistical formula to find the minimum sample size for a statistically representative sample set.  For this model, we use the following parameters:

  • Margin of error:  5%
  • Confidence level:  95%
  • Response distribution:  50%
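The article doesn’t spell out the formula, but these parameters match a standard sample-size calculation: Cochran’s formula with a finite population correction.  Here is a sketch under that assumption (illustrative only - the platform’s exact formula may differ):

```python
import math

def suggested_sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Minimum sample size via Cochran's formula + finite population correction.

    z = 1.96 corresponds to a 95% confidence level, margin = 0.05 is the
    5% margin of error, and p = 0.5 is the (most conservative) response
    distribution -- the parameters listed above.
    """
    n0 = (z ** 2) * p * (1 - p) / (margin ** 2)  # infinite-population estimate
    n = n0 / (1 + (n0 - 1) / population)         # correct for finite population
    return math.ceil(n)

# A collection of 500 artifacts yields a suggested sample of 218
print(suggested_sample_size(500))
```

Using p = 0.5 maximizes the required sample size, so the suggestion errs on the side of reviewing more artifacts rather than fewer.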

Percentage method:

This method allows you to provide a percentage value and then your sample set will be created as the supplied percentage of the overall number of artifacts within the collection.

Absolute method:

This method allows you to provide an absolute value (a number of artifacts) and then your sample set will be created with that number of artifacts from the collection.
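Both of these methods are simple arithmetic.  Here is a sketch of each (the round-up and capping behavior shown are assumptions, not documented platform behavior):

```python
import math

def percentage_sample_size(population, percent):
    """Sample set sized as a percentage of the collection, rounded up."""
    return min(population, math.ceil(population * percent / 100))

def absolute_sample_size(population, count):
    """Sample set sized as a fixed number, capped at the collection size."""
    return min(population, count)

print(percentage_sample_size(250, 20))  # 20% of 250 artifacts = 50
print(absolute_sample_size(250, 300))   # requested 300, capped at 250
```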

Once you have created your sample set, you’ll be able to see the size of the sample set and a view of the artifacts that were selected for the sample set.

Managing reviewers and review tasks

Coordinators will need ways to see how reviewers are performing and (at times) add/remove reviewers to complete an assessment.  Coordinators can do this from the Review Summary, which lives within the Reports section of the assessment.  This view shows the reviewers associated with the assessment, the number of reviews assigned to each reviewer and the number of reviews completed by each reviewer.  

Redistributing reviews

If changes are made to the review team, or the pending reviews need to be rebalanced across the active reviewers, the coordinator can click on the ‘Redistribute Reviews’ button.  This will evenly redistribute the pending reviews between all of the active reviewers. 

Note:  Redistributing reviews is only applicable in pre-assigned review strategies.  
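Conceptually, redistribution just re-deals the pending tasks across whoever is still active.  An illustrative sketch (not the platform’s actual logic; completed reviews keep their original reviewer and are untouched):

```python
def redistribute(pending_reviews, active_reviewers):
    """Evenly re-deal pending review tasks across the active reviewers."""
    assignments = {reviewer: [] for reviewer in active_reviewers}
    for i, review in enumerate(pending_reviews):
        # Round-robin: task i goes to reviewer i modulo the team size,
        # so counts differ by at most one across reviewers.
        assignments[active_reviewers[i % len(active_reviewers)]].append(review)
    return assignments

# 9 pending reviews over 2 active reviewers -> shares of 5 and 4
new_tasks = redistribute(list(range(9)), ["ana", "ben"])
```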

The next step is to begin Reviewing Work.
