The PBS Endorsement Process
PBS endorsement is awarded to candidates who submit a portfolio of their work. Portfolio reviewers, also known as endorsement board members, must submit an application to establish their expertise and involvement in the field. The link to the application can be found here. The requirements evaluated include holding PBS endorsement for a minimum of 2 years, having taken Person Centered Thinking within the past 3 years, and being active in the PBS Facilitator community (through past mentoring, supervising, teaching PBS, and/or participation in state or national PBS committees). Each applicant must provide one recommendation from a peer-endorsed PBS Facilitator who has witnessed their skills in Positive Behavior Support and who is neither related to them by blood or marriage nor a subordinate of theirs in a supervisor/employee relationship, and a second recommendation from someone who is familiar with their work. Applications are reviewed by the PBS Advisory Board, which meets quarterly. Endorsement board members are hired on an as-needed basis and paid for their work.
Interrater Agreement of Scoring Process by Endorsement Board Members
Interrater agreement between endorsement board members (see Gisev et al., 2013) will be calculated for each portfolio review. Using an independent, blinded review process, each of three endorsement board members will score the same PBS portfolio using a form built in Research Electronic Data Capture (REDCap). Each endorsement board reviewer will then submit their form in REDCap, providing a rating for each item. Once all of the endorsement board members have submitted their ratings, a summary form will be produced for the portfolio review that reports the percentage agreement per question and an overall interrater reliability score, calculated as the average percentage agreement across all questions. Individual item agreement is computed using a series of if/then statements that check whether each rater's value matches each other rater's value, coding an agreement as '1' and a disagreement as '0'; the item's percentage agreement is the number of agreements divided by the total number of rater pairs compared (in this case, Rater #1 vs. Rater #2, Rater #1 vs. Rater #3, and Rater #2 vs. Rater #3). This method allows the project team to identify specific questions that are outliers in terms of agreement while also providing an overall reliability score for all items. The overall interrater agreement score is calculated as the mean of the percentage agreement across all included items.
Example of calculations:
Rater #1 | Rater #2 | Rater #3 | % Agreement
1 | 1 | 0 | 33.3%
1 | 1 | 1 | 100.0%
0 | 1 | 0 | 33.3%
  |   | IRR | 55.6%
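The pairwise logic described above can be illustrated with a short script that reproduces the example table. This is only a sketch of the calculation, not the REDCap implementation itself (which uses calculated fields); the function names and the use of Python are illustrative.

```python
from itertools import combinations

def item_percent_agreement(ratings):
    """Percent agreement for one item: the share of rater pairs
    (1 vs. 2, 1 vs. 3, 2 vs. 3) that assigned the same score."""
    pairs = list(combinations(ratings, 2))
    agreements = [1 if a == b else 0 for a, b in pairs]  # mirrors the if/then checks
    return 100 * sum(agreements) / len(pairs)

def overall_irr(items):
    """Overall interrater reliability: mean percent agreement across items."""
    per_item = [item_percent_agreement(r) for r in items]
    return per_item, sum(per_item) / len(per_item)

# Example scores from the table above: one (Rater #1, Rater #2, Rater #3) row per item.
items = [(1, 1, 0), (1, 1, 1), (0, 1, 0)]
per_item, irr = overall_irr(items)
print([f"{p:.1f}%" for p in per_item])  # ['33.3%', '100.0%', '33.3%']
print(f"IRR = {irr:.1f}%")              # IRR = 55.6%
```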
Training for Portfolio Review and Interrater Agreement
Prior to their first portfolio review, all existing PBS Board members will meet and, using the revised PBS plan reviewer checklist, evaluate a previously submitted portfolio. This will give them an opportunity to openly ask questions and seek clarification on any item. Then, using a different portfolio, the endorsement board members will independently complete the score sheet without seeing one another's ratings. The group will then reconvene to discuss any discrepancies between scores, and reliability will be calculated.
If interrater agreement between endorsement board members falls below 85%, further training will be scheduled and the scoring process will be repeated; this cycle will continue until the 85% reliability criterion is met.
Any new reviewer will be trained by project staff at the Partnership for People with Disabilities to a criterion of at least 85% agreement with another reviewer across 3 portfolio reviews.
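As a simple illustration of these criteria, the checks below assume the 85% threshold applies both to a training session's overall IRR and to each of a new reviewer's 3 portfolio reviews; the per-review reading of the new-reviewer criterion is an assumption, and the names are hypothetical.

```python
CRITERION = 85.0  # minimum acceptable interrater agreement (%)

def training_session_passes(session_irr: float) -> bool:
    """Board-wide training: if the session's IRR is below the criterion,
    further training is scheduled and scoring is repeated."""
    return session_irr >= CRITERION

def new_reviewer_qualified(agreement_scores: list[float]) -> bool:
    """New reviewer: assumed to require at least 85% agreement with
    another reviewer on each of 3 portfolio reviews."""
    return len(agreement_scores) >= 3 and all(s >= CRITERION for s in agreement_scores[-3:])

print(training_session_passes(55.6))               # False -> schedule further training
print(new_reviewer_qualified([90.0, 88.2, 86.7]))  # True
```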