Assessment of Giftedness

We provide assessments for qualification for gifted programs at private schools such as The Evergreen School, Seattle Country Day School, Open Window School, and University Child Development School (UCDS), as well as for the Seattle Public Schools’ Highly Capable Program and other public school programs.

For intellectual assessments:

· the Wechsler scales (WPPSI-IV, WISC-V, and WAIS-IV)
· the Stanford-Binet, 5th Ed. (SB-V)

For reading and math achievement testing:

· the Wechsler Individual Achievement Test, 3rd Ed. (WIAT-III)
· the Woodcock-Johnson Psychoeducational Battery, 3rd and 4th Ed. (WJ-III and WJ-IV)

Appointments for assessment of giftedness are typically two hours in length.  They involve 5-10 minutes of getting acquainted, 80-90 minutes of testing (with a 5-10 minute break about 50 minutes into the session), and 20-30 minutes at the end to discuss the results with the parents.  This discussion also addresses options for school placement.  Academic achievement testing follows a similar schedule but may run from one to two hours, depending on the age and work pace of the child.  You are welcome to bring a snack for your child’s break.  Assessments are scheduled only in the mornings, when your child is fresher and we can get a more valid measure of intellectual aptitude.

PLEASE NOTE:

A message about recent changes in how the Seattle School District evaluates assessments performed by independent licensed psychologists when those assessments are used to appeal placement decisions for Highly Capable Programs.

In late October or early November of this year, the Seattle School District amended its procedures for appealing qualification decisions for the district’s Highly Capable programs.  One of the changes sets a much higher standard for qualifying scores on tests administered by independent psychologists than for tests administered by district examiners.  Under the current rules, students can continue to qualify for the Highly Capable Cohort with scores at or above the 98th percentile on a test of cognitive abilities and the 95th percentile on tests of reading and math achievement when the tests are administered by school district examiners; the vague and contradictory bar for tests administered by independent psychologists is apparently at or somewhere above the 99th percentile in all three domains: cognitive abilities, reading, and math.

While the Seattle School District has the authority to establish the admissions qualifications for its programs, I cannot identify a rationale for this policy that has any basis in psychometric principles or applications.  The colleagues with whom I have discussed this policy voice similar perplexity about its logical basis.  My colleague and office-mate, Amy Summers, Ph.D., has crafted a considered and scholarly review of many of the issues related to this policy revision.  I do not need to reinvent the wheel (especially as I don’t think I could do it as well!), so let me refer you to her review at www.dramysummers.com.  I will just add a couple of points to her list of myths 1-4 regarding the practice of independent testing by licensed psychologists.

Myth #5:  Group-administered tests are just as valid as individually administered tests.

When you measure children’s height, you know exactly how tall they are.  This is a direct measure.  Psychological and educational tests are not direct measures; their usefulness comes from their ability to predict other behaviors, such as success in accelerated and enriched programs.  To do this effectively, the tests need to control for other variables that might interfere with their predictive value.  Such variables include testing conditions that distract students during the test or inhibit their responding; the training and experience of the examiner; and the ability to ensure that all students taking the test have an equal opportunity to understand and respond to the questions.

In tests that are administered in a group format (such as the CogAT), the potential for confounding variables increases because the students interact with and affect each other.  For example, a number of parents have told me that their children (especially compassionate younger children) reported after CogAT testing that they didn’t answer all of the questions because they wanted to give other students a chance.  In addition, the group format means that the examiner must divide his or her attention among the students and so has less opportunity to observe and address the validity of the test administration for each student.

Myth #6:  Independent psychologists are paid by the parents, and so just say what the parents want to hear.

This idea has been asserted by a number of educational professionals who should know better, including at least one former school district superintendent.  We have always found this assertion difficult to understand.  Our fees are based on providing a service (in this case, the valid administration and interpretation of assessments) and do not change based on the outcomes.  In fact, we would consider it a disservice to the students we assess and to their parents to falsify or inflate scores, as this might contribute to students being placed in accelerated programs that they cannot handle.  That would hurt both their learning and their self-esteem.

This assertion about the questionable validity of independent assessment results is even more confusing given that the school district’s examiners may actually be subject to more varied pressures than independent psychologists.  Where the independent psychologist’s purpose is to provide information that serves the appropriate needs of the student, the district examiners who administer the CogAT are employees and agents of the school district.  They are also expected to respond to district policies and expectations.  If the district has an interest in capping the number of students who qualify for Highly Capable programs, or has other objectives, these may conflict with a primary focus on serving the student as the client.

When using data such as testing results to support decision making, two types of errors may occur.  A Type 1 error is a “false positive”: including a subject who does not truly meet the criteria.  Inclusion in a highly capable program based on a Type 1 error would not be in a student’s best interests, because it poses a risk of struggling to understand and keep up and can contribute to low self-esteem.  A Type 2 error is a “false negative”: excluding a subject who truly should qualify for the program.  This type of error would also not be in the student’s best interest, as it would deprive him or her of the higher level of intellectual and academic challenge that matches the student’s capabilities and prior achievement.  However, Type 2 errors may serve the decision-making body if the intention is to cap or reduce the number of applicants admitted to the program.

Please note: The Seattle School District requires that independent examiners submit a copy of their license with reports of appeals testing.  This means that, for the most part, this testing will be performed by licensed psychologists.  We have earned doctoral degrees in psychology, completed extensive internships, and passed rigorous written and oral examinations to earn our licenses.  We are bound to comply with the laws governing our licenses and the strict code of ethics developed by the American Psychological Association.  Those of us who are state or nationally certified school psychologists are also bound by the Washington Code of Educator Conduct and/or the Principles for Professional Ethics of the National Association of School Psychologists.

Revised 11/29/17