NASP Communiqué, Volume 39, Issue 6

Upgrading to SP 4.0

By Brian P. Leung

As I prepared to upgrade my computer operating system to Windows 7, I began to think about my job of upgrading the software (a.k.a. my students) for their future work as school psychologists. Call me a computer geek, but I see many parallels. With apologies to Microsoft… It has always been my desire to prepare my students to be the latest and most versatile version of professional school psychologists; I will call them SP 4.0. But while I strive to install expanded and new features in SP 4.0, they often encounter difficulties running at full capacity during field-testing. Many of my interns (and graduates) quickly discover that they are operating within archaic environments, containing outdated software, fragmented hardware structures, and earlier versions of SP! The danger of installing new software in an old operating environment is well documented: a high probability of software and hardware conflicts. These conflicts can render the new software nonoperational, or at least degrade its full feature set. The new software has to reconfigure so much of itself in order to run smoothly that it soon becomes indistinguishable from the old software.

Here are some descriptions of SP versions that I (and my students) have encountered and potential areas for conflict.

SP 1.0
Assessment: Best tester in town! “Standard” battery for all students regardless of referral question. Full battery for triennials, even high school students. Strong reliance on test profiles and subscores for …
Intervention: Best leave it all to teachers (who are the best trained, after all). Occasional counseling and SST attendance. Stays mostly in office doing psychologist work.
Perceived: Known mostly by SpEd …
From Santa: Self (likes to do what’s …)

SP 2.0
Assessment: Second best tester in town. Full battery triennials. Uses RIOT but not for … Believes that more testing will save job.
Intervention: Regular counseling caseload. SST attendance. Ventures out to playground, mostly for observations of referred students. Likewise to teacher lounges, to just eat.
Perceived: Known by front office and teachers who have referred …
From Santa: Self, and … about lack of time to do “more” (but no real risks/attempts to actually do …)

SP 3.0
Assessment: Shortened test batteries, mostly to save time. More use of the MDT. Uses multiple data sources (RIOT) for diagnosis.
Intervention: Counseling both GenEd and SpEd students. Teacher consultation. Occasional parent workshop. Hardly in office.
Perceived: Known by most on campus.
From Santa: Programs for …

SP 4.0
Assessment: Selective use of tests based on referral questions. Consults as a way out of testing referrals. Typical no-test triennials. Use of RIOT for cross-validation.
Intervention: MDT or thematic team … Collect data for program … Participates in school-based direct interventions. Looks for school-wide interventions (e.g., school climate, teacher support, family–school partnerships). Collects data to assist with intervention adjustments. Seldom in office.
Perceived: Known by everyone on … School (adults and children; policies and …)
From Santa: School to buy more days, in order to do more for adults and …
I stress to my students that earlier SP versions are no less hard-working, dedicated, or focused on their jobs; they were simply developed in an earlier time, when the developers had narrow purposes for the software (e.g., special education placement or bust). Early versions tended to have fewer features and an untested ability to communicate across platforms. Moreover, multitasking and multithinking are typically not expected of older software, because the users (clients) are not accustomed to such robust features and never ask for anything more. Newer software requires more RAM, or at least more efficient use of existing memory, and treats regular upgrades as a routine part of running all applications. These are notions that older software often finds difficult to integrate.

All kidding aside, the integration of newly graduated school psychologists with their veteran colleagues has probably brought tears and anguish to both sides. With each graduating class, I instill a broader vision of school psychology to make a bigger impact, but I often share the students’ struggles to gain traction toward that big picture … without being degraded by unending referrals for testing, or by being evaluated by the size of one’s test battery or the number of SpEd placements. I would enjoy ongoing conversations with both trainers and practitioners on strategies to move our profession forward together and upgrade the entire system, so that all software can function maximally for the benefit of everyone at school—adults and students!

Brian P. Leung, PhD, is the director of the school psychology program at Loyola Marymount University and has served as a school psychologist in both urban and suburban districts.