Deaf Students in Scottish Higher Education
Chapter Seven: The Assessment Process
- Shortage of assessors
- Variability in assessors and assessments
- Timing of assessments
- Quality indicators
- What does assessment mean?
- Assessment proformas
- Changing requirements of students
- Costs of assessment
- Key points
In order to qualify for Disabled Students' Allowance (DSA), students must provide evidence of their ‘need’ for financial support. This usually takes the form of a ‘needs assessment’ carried out by an assessor. Most assessments take place in the four Access Centres in Scotland. Currently only one Access Centre is located within an HEI: the other three are located in FECs. There are also a number of individual assessors, outwith the Access Centres, whose assessments are accepted by SAAS.
While there were examples of assessments which were satisfactory for all concerned, three main problematic issues were identified in the current system: shortage of assessors, variability in assessors and assessments, and timing of assessments.
Shortage of assessors
The four Access Centres have proved insufficient to cope with the increased demand from student applicants. Even with some additional SAAS-approved assessors, who operate independently outwith Access Centres, there is a general shortage:
"The lack of assessors to carry out assessments of need for DSA is a significant barrier which stops students receiving the help they need as soon as they need it."
(Skill Scotland 2003, p3)
A backlog of assessments can build up at key times, leading to critical delays in DSA being awarded. One Access Centre manager comments that: “ninety-nine per cent of DSA assessments at the Access Centre are done after the course starts”. However, this manager also explains that if the student is moving from FE to HE within the same college, the assessment would be done before the summer.
"It’s wrong that there is this delay. Assessment support should be assessed and strategies put in place before the summer. DSA is the problem. If HEIs were funded to provide support this would enable the process to be improved. Any deaf student who commences the course without appropriate support strategies is immediately at a disadvantage to their peers because they cannot access information."
A freelance assessor comments:
"When I was doing quite a lot [of assessments] I always got stragglers once the university course had started, round about Christmas etc. And they were struggling. And they would phone in a panic. But I always left it, you know, open until March because there was always a few."
Thus a student may be at the end of the second term before an assessment is carried out: this may mean that the student has been completely without appropriate access support. Skill Scotland also draw attention to the problems of delay: “disabled people need support and funding in place when they start a course to ensure that they are not placed at a disadvantage” (Skill Scotland, 2003, p.1).
Variability in assessors and assessments
Assessment within an Access Centre does not mean that someone with qualifications and experience in relation to deaf people will necessarily carry out the assessment. In fact, there is considerable variation in the background qualifications and skills of assessors.
The current Access Centres range from one which is located in an institution with a strong tradition of working with deaf students, and which has therefore built up considerable expertise in this area, to one which sees relatively few deaf students. In some cases, the extent of contact with deaf students has enabled assessors and staff of Access Centres to be very aware of deaf people’s requirements. However, it seems to depend very much on the particular situation of individuals, the enthusiasm and goodwill of specific staff and their particular experiences in terms of the type of deaf people they have worked with. For example, some assessors have never assessed a BSL-using deaf person, while others have familiarity with this group.
There was also variation in the willingness to use external agencies when staff did not have relevant expertise.
One manager of an Access Centre which had dealt with only two or three profoundly deaf students over the last seven to eight years said that the Centre did not use an external agency for specialist expertise and did not imagine there would be a need to consider this. There was an expectation that deaf students would be the same as others in terms of negotiating support. However, it is difficult to see how a deaf student could obtain the required assessment and access support when staff do not have deaf-oriented skills and knowledge and no external agencies are used.
A more open attitude was demonstrated by another Access Centre assessor, who stated that it is important:
"…not only to know what you do know, but to know what you don’t know and to make appropriate referral for additional advice or for other sources of expertise. It’s not a one-size-fits-all. It’s not, you know, a suggestion that as a consequence of becoming skilled in practical assessment that you could automatically the next day take up every component … like one day you might be an obstetrician and, you know, the next day you might be an orthopaedic surgeon. We’re not suggesting that at all."
Some centres make use of presentations and consultations with commercial firms, e.g. those dispensing hearing aids and audiological equipment. However, as suggested on the SAAS website pages, there is some danger that commercial representatives may not take the objective stance that an independent advisor could. Other Centres which are more familiar with the range of language requirements of deaf people are more likely to bring in commercial personnel only as an adjunct to a range of support information.
The service provided by SAAS-approved freelance assessors of deaf students also varies. One explained that her background was in audiology and the fitting of hearing aids. This assessor, therefore, automatically sought out detailed information relating to the level and type of hearing loss and would even contact Audiology clinics to obtain this information. Other assessors would not consider such information to be crucial. Indeed, one assessor indicated that she did not understand the audiological dimension very well and would need to ‘seek further advice’.
Disability Advisors and staff of Access Centres themselves expressed concern at the variable practice in terms of both the nature of assessment and the personnel involved in such assessment. There was particular concern that referrals for needs assessments may be made by SAAS and LEAs on an ad hoc basis:
"I feel discomfort with the idea that there is this ad hoc list … ’Well they’ve always done assessments, so we will continue to accept theirs’ … There’s a real sense of fragility and inconsistency with current ways in which people are either deemed to be appropriate for assessment or not."
We thus have a wide range of individuals carrying out assessments with varying levels of specialist knowledge and skills. Clearly, in order to ensure fair assessments for all, there needs to be some specification of appropriate expertise. Otherwise the access provision may reflect the assessor’s individual knowledge base, rather than the requirements and preferences of the students. This point will be further expanded later in the chapter.
A related issue is the fact that there is evidence of variability in the assessment process itself. This can be seen both in the accounts given by assessors and students, and in the written records and assessment documents produced. These ranged from highly detailed proformas to very brief letters summarising key points. No proformas shown to the researchers provided particular headings or sections relating specifically to the access requirements of deaf people.
A number of assessors have made use of the proformas developed by the National Federation of Access Centres. However, these are seen by a number of assessors as too detailed to be helpful: because they try to cover a range of eventualities, they often include information not relevant to the particular student:
"in our system, if a student comes and there’s nothing relevant to transport, I just don’t do anything about transport. Whereas with that format you absolutely have to comment on transport, placement, whatever. Even if you are only saying it’s not relevant. It makes for a very long report. And I feel students don’t really have the time to spend on reading anything as long as that. So we have gone with just sending in a shorter, quite a narrative report about the student."
(Access Centre assessor)
Timing of assessments
When the needs assessment is not conducted in the particular HEI a student wishes to attend, there may be very limited, or even no, information available about the course of study which the student will be undertaking. It may therefore not be easy to ensure that access arrangements fit directly with the demands of the course. As one Access Centre assessor describes:
"Now you’d want a student to be equipped before they start. But then it’s ‘catch 22’. How do they know what the course is going to ask of them? And that’s where our method of assessment or our requirement of assessment is very unsatisfactory because we see the student once. If you are really doing a good job of assessment, you would want to go in and say, ‘We’ll try this’ and then set them up with something and then go back after three weeks ‘How’s that going? Do you need something else? Is that working? Do you need something different?’. And we can’t do that with our form of assessment. And that’s why it would be better if students are assessed in their own college, lent equipment and then ask SAAS for funding for it. So it’s not a good way of doing it to be honest. It’s the best we’ve got but it’s not the ideal."
Thus we have a situation where there is a shortage of and variability among assessors and assessments, and a ‘catch 22’ in assessment timing.
Quality indicators
In order to address some of these issues in relation to the assessment of disabled students, the BRITE Centre initiated a set of quality indicators for the assessment process. The initiative aims to build the capacity of individual institutions to carry out these assessments, which will reduce the pressure on Access Centres and the consequent delays for individual students. It also allows for specific requirements of the course to be taken into account in an ongoing process rather than a one-off interview. This issue is further expanded later in the chapter.
The ongoing development of a ‘Toolkit’ to improve the quality and consistency of the assessment process is being taken forward by the Disabled Students Stakeholder Group, under the auspices of SAAS.
The Director of the BRITE Centre, Alison Cox (Cox, 2003), has stressed the importance of ensuring that the purpose of the assessment is agreed and that all stakeholders have confidence in the process: this involves establishing clear criteria, having clear standards in place and putting into place a means of auditing the process.
The Initiative gives recognition to some of the real and potential problems within the assessment process. It aims to establish a level playing field, so that all stakeholders involved can know that all of the people undertaking the assessments have the appropriate qualifications and skills. It also acknowledges the importance of having some type of quality assurance of assessors and the whole assessment process.
The Scottish Executive Disabled Students Stakeholder Group has adopted elements of the BRITE Initiative’s original quality indicators in the formation of ‘The Toolkit for Needs Assessment’.
But will the Toolkit alone be sufficient to guarantee that deaf students will receive a consistently comprehensive, relevant and effective DSA assessment in future?
What does assessment mean?
The training for assessors offered through the BRITE Initiative sets out five distinct types of assessment. But, at a fundamental level, the term ‘assessment’ is itself ambiguous. For those deaf people who are fully aware of the types of access provision that suit them, an assessment may be neither necessary nor appropriate: what they primarily need is an opportunity to share this information with someone who is able to help co-ordinate the relevant provision.
Thus an interview may well be appropriate. However, because of the funding arrangements (Disabled Students’ Allowance), the initial interview is located within a disability framework, incorporating an individual needs assessment. However much the individual assessor tries to move away from a medical model of disability and deafness, there is a sense in which this is built into the system.
This notion can perhaps best be seen in relation to those deaf people who are said to ‘need’ ‘signers’. Both terms are placed in inverted commas here because both indicate a level of misunderstanding. One university runs modules in which deaf people provide some of the lectures and seminars. In these cases, a BSL/English interpreter is provided so that people who do not share the presenters’ first language, that is, primarily hearing people, can follow the lecture. When a deaf student makes a contribution to a seminar using BSL, the interpreter is present as much for the teaching staff and fellow students as for the deaf student. Describing the interpreter as a ‘signer’ ignores the reality of interpreting as a two-way process and also ignores the ‘needs’ of the teaching staff and hearing students.
In addition to raising a fundamental reservation about the concept of ‘needs assessment’, the latter example demonstrates one of the many complexities of deaf students’ situations. How far the Toolkit as it stands can address this kind of complexity will now be explored.
The Toolkit undoubtedly provides a positive general framework as described above. However, important issues relating to the linguistic nature of deaf student access are not included. For example, the requirement that ‘assessors have relevant qualifications and experience’ is undoubtedly vital, but it is unclear what this means in relation to the assessment of deaf students. One could have had experience of working with deaf students who use amplification and/or note-takers to access information, but no experience of students who use BSL (and the two-way nature of BSL/English interpreting as described above), or vice versa. One may have had some experience, but no understanding of why many deaf students have an issue with English language, and so on.
Furthermore, given the fact that many deaf DSA applicants are unaware of the range of access strategies available (see ‘Changing requirements of students’ later in this chapter), there is a need for assessors to keep up-to-date with the full range of strategies and services and the qualifications relevant to staff who offer those services. Of course it would be inappropriate to expect assessors themselves to have qualifications in note-taking, lipreading or BSL/English interpreting. However, it is essential for assessors to know that qualifications exist in these areas and to know what they are and where qualified staff can be found.
"If anything, the qualifications and the variability and so on in relation to the deaf population is perhaps more highlighted than in any other population in that, you know, there are so many …"
(Access Centre assessor)
Assessment proformas
The Toolkit includes a revised draft proforma which is suggested for structuring the assessor’s report. It is a generic proforma which, while leaner than the Access Centre version described earlier, covers all applicants for DSA. Most of the Access Centre staff interviewed for this study said that they adapted their original proforma when interviewing deaf applicants. Nevertheless, the continued absence of deaf-specific indicators means that, even though exemplification is sometimes deaf student-related, assessors who are unfamiliar with a particular type of access provision will remain in the dark when using this form alone: they will not be alerted to the full potential of provision, e.g. electronic note-taking, lipspeaking, etc.
The UK-wide National Association for Tertiary Education for Deaf People (NATED) has developed a specific Assessment Pack for use with deaf or hard of hearing students. The pack was primarily designed for use within FE situations, but is also adaptable to HE. It is of interest because it further illustrates the complexities involved in access requirements for deaf people. Additionally, as will be expanded upon later, it addresses the need for assessment to be seen as a process, as also advocated by the BRITE Initiative and others, rather than a one-off event.
Listing the different elements with brief explanatory notes requires four A4 sheets in the Assessment Pack. The initial assessment, for example, includes the following headings: Learning Style; Audiological Information; Assessment Results; Reading; Written Work; Numeracy; Study Skills; Technology Skills; Assessment of Language Skills; Communication Skills; Communication Mode Used; Sign Production; Sign Reception; Fingerspelling Production; Fingerspelling Reception; Lipreading/Speech Perception; Speech Production; Anticipated Support Needs. Given some of the assessment tools mentioned or recommended (e.g. the Test for Reception of Grammar (TROG), which relates to English), the amount of time involved for any one student would be very considerable indeed. However, the detail of the NATED pack demonstrates that there is an awareness of the range of linguistic requirements of the potential student intake.
At the same time, the pack raises issues about whose role it is to assess the students’ skills in the different areas concerned.
Certainly, it is common for HEIs to specify literacy requirements – and sometimes numeracy requirements – for admission to undergraduate and postgraduate courses of study. Most hearing learners coming from school have formal qualifications to indicate their levels of English and Maths, but deaf learners may not have these, despite having an appropriate level of ability in the subject applied for. NATED would argue that a literacy test enables assessment of the level and nature of any additional language tutorial support needed and/or additional access services – such as interpretation and/or modification of course materials. But, at HE rather than FE level, it could be argued that a negotiation with a student about their linguistic access requirements would be more appropriate than an assessment or ‘test’.
Again, this raises some fundamental issues of principle which cannot be discussed at any length in this report. In any case, any views which might be presented on this matter would not necessarily relate to the views and experiences of those interviewed during the research (the NATED pack was not explicitly discussed in the interviews). What we can say is that, although in a very few cases quite detailed reports were carried out on the student’s requirements at initial entry, we saw nothing of the scale suggested within the NATED documentation.
The Toolkit’s suggested proforma deals with written language inasmuch as it includes 'producing written work' and 'reading printed material' as 'effects of disability on study', but there is no guarantee that the assessor would understand why a bright deaf student might have issues with written English, and the assessor would therefore not be in a position to know in what circumstances and how best to recommend specialist tutorial support.
A further issue is that prospective NATED assessors are required to go through a registration process in order to ensure that they have sufficient background knowledge and skills to be able to provide a professional service. While the Toolkit requires a quality assurance process, this is based on the generic indicators. Thus an HEI could be validated as an assessment centre, but still not have specific validation in relation to specialised expertise needed to offer a high quality service to deaf applicants. It is clear that only a very few of those assessors interviewed for this study would be eligible for registration as a NATED assessor. It is unlikely that the Toolkit will alter this situation.
NATED itself is not very visible within Scotland. However, one member of the Research Team and one member of the Consultation Team were members and, in fact, approved assessors for NATED. Their contributions suggested that the level of exchange and interaction encouraged by NATED in relation to deaf people within FE and HE was also needed in Scotland. Such interaction may be facilitated by finding ways of encouraging more Scottish personnel to join NATED, or by establishing additional networks, possibly through the proposed Linguistic Access Centre (see Chapter Thirteen).
Thus, while it is likely that not all of the NATED pack would be appropriate, it does address the complexity of the assessment requirements of deaf students in a way not possible for the Toolkit. It also addresses the likelihood that the requirements of students will change from those predicted at an interview prior to the start of the course.
Changing requirements of students
The NATED assessment process involves an initial assessment; an on-entry assessment; a review; an ‘early exit’ assessment, if appropriate; and an ‘exit’ assessment which takes place in the final term. This ‘exit’ assessment is seen as crucial both for the individual student and for auditing the overall quality of the provision. It is a concept which does not seem to have been implemented regularly in Scotland, for reasons explained earlier, although the devolving of DSA assessments to institutions should mean that there is more potential for this level of ongoing review.
The particular need for an ongoing review of deaf students' access requirements is borne out by evidence that these requirements are quite likely to change over time. Such change may be linked to students' developing familiarity with and evolving preferences for specific types of access, the changing demands of their courses, and the development of their own access strategies:
"It’s important to recognise that students’ own strategies will develop, so support needs to change. The student has to be able to move with that, as does the institution. Ongoing review …"
(FE Disability Adviser)
A further recurring theme from both staff and students is that many deaf students are unfamiliar with the range of access provision which may be appropriate to them. For these individuals, the opportunity to explore strategies with assessors is likely to be helpful in identifying suitable provision. This is particularly the case for students who come directly from schools: those coming from FE colleges have usually had the opportunity to familiarise themselves with certain types of strategy, although this is not universally the case (see also under Transition Issues). Disability staff suggest that the educational experiences of deaf students may have been narrower than those of other students, and the amount of information they have access to may also be more limited:
"[The students] have strategies which they have used in school, perhaps, but these strategies are not usually transferable into the larger environment of a college or particularly a university where the learning and teaching is of a different type… students have to cope on their own, they have to cope in large numbers. In college we mostly have small groups of between fifteen and twenty-five—some classes are thirty. That is still often larger than what students are used to. So it is difficult for them to transfer those [strategies] from school to college."
There is recognition that students themselves might need training in order to benefit from access arrangements:
"Because often when the students come they are not used to using support. Many of the students have not worked with an interpreter at all. They have been in a school and they have had a teacher for the deaf. But they have not had an interpreter, so they don’t understand that role. They have got to learn how to work with a communication support worker. They have got to learn how to interact with the lecturers and what is expected of them and what they can't do … If they are going on to higher education, we have to try and wean them — try and develop that support so that it is more appropriate for university and so that they are more independent — so they can ask the questions themselves. It is great to see that happening actually. The younger ones get the confidence and it is great for the ones that don’t go on to university as well. They might go on to a vocational programme or to get apprenticeship or something like that. And you see that confidence and you see that person developing over the year or two that they are with us."
Here it is clear that the FE environment, in which different types of access provision are visible and available, provides a nurturing environment. However, some staff do suggest that students are able to adapt quickly, even when they are not used to a particular service. This seems to apply in those situations where there is a stronger deaf presence, e.g. within particular FE contexts where new students are able to see other students using types of access they may not have used before.
The whole assessment process needs to be re-examined in the light of the specific linguistic requirements of deaf students. The current variation in types of assessment, personnel involved and qualifications of assessors must be addressed if students are to be given appropriate access within HE. It would therefore seem appropriate to establish clear and thorough deaf-specific quality indicators, together with external organisation contacts, which could supplement the Toolkit indicators when a deaf student is involved.
It is suggested here that, in addition to this, there needs to be some mechanism for strengthening the work carried out by assessors, whether located in an Access Centre or not. There needs to be what was described within the consultation group as a ‘powerhouse’ which can provide a high level of support and training both to those engaged in assessment and those involved in access provision (see Chapter Thirteen).
In the meantime, a working group of key staff from FECs, HEIs and other relevant organisations has recently been formed with the aim of ‘working towards best practice in linguistic access for d/Deaf students’. This would seem like a good forum for developing the content of specific indicators.
Costs of assessment
Several contributors made reference to the costs of assessment, a point also raised by Skill Scotland in their submission to the Funding for Learners Review:
"When students discover that it is likely that they have a disability, they need confirming evidence from certain qualified people or organisations before they can claim DSA. For example, when it is likely that an individual has dyslexia, they must get a report to confirm this from an Educational Psychologist. Such a report often costs around £200, and the cost can stop a student from claiming DSA or getting the support which they need."
(Skill Scotland 2003, p4)
Interestingly, the SAAS website refers users to the RNID as one possible source of assessments, even though RNID are not currently providing this service. For confirming a diagnosis of dyslexia, independent educational psychologists may charge between £200 and £300. No data is available on the costs, if any, to a deaf student of ‘proving’ their deafness. This is a separate cost from a full needs assessment which, according to the Toolkit, can be as high as £375. It seems therefore appropriate to support Skill Scotland’s recommendation that “the Scottish Executive consider means of funding these costs, and directing universities and colleges accordingly” (Skill Scotland 2003, p4).
Key points
- There is a shortage of DSA-approved assessors and a consequent backlog of assessments, leading to damaging delays.
- There is variability in knowledge and skills of assessors who assess deaf students and in the assessments themselves.
- There exists a 'catch 22' situation in the timing of assessments: late assessments mean students can end up waiting as long as a term (or even longer) for access services to be established, but assessments undertaken early mean that the characteristics of the course itself cannot be adequately included.
- A 'Toolkit' of quality indicators has been produced which addresses the above issues by offering a quality-assured framework which can be used to build capacity of HEIs to deliver assessments. However, the Toolkit continues to operate within a 'needs assessment' framework, and the Toolkit does not adequately take into account the complexities of the linguistic access situation for deaf students, including the range of strategies and relevant qualifications of access staff.
- The National Association for Tertiary Education for Deaf People (NATED) has an assessment pack which addresses the complexities of the linguistic access situation, and requires assessors to register in order to use it. At least some of the pack would be useful in devising supplementary indicators.
- It is likely that access requirements of deaf students will change from those predicted at the beginning of the course of study. Therefore it is particularly useful to see DSA assessment as part of a process, rather than a one-off event – the NATED pack, and the guidelines from the BRITE Initiative, take this into account.
- A working party has recently been formed to take forward issues related to standardising quality assurance in access services for deaf students in HE/FE. There is potential for this group to address supplementary quality indicators, in co-operation with NATED, BRITE and others.
- Students sometimes have to pay for assessments, which could deter them from applying for DSA.
7.1 Clarification of access requirements should be made as early as possible and initial provision established by the beginning of the course.
7.2 A supplementary set of quality indicators and exemplars needs to be developed for DSA assessment of deaf students, in co-operation with BRITE, and a procedure established for their implementation. The format would include adequate reference to language choices and the broad range of linguistic access strategies available. The NATED pack could be used as a starting point for development, and the NATED network fully exploited.
7.3 Clarification of access requirements should be seen as an ongoing process and individuals should have the right to opt in and out of different types of access depending on their experiences of the usefulness of the provision. The assessment stages built into the NATED pack could be exploited, in addition to examples of good practice identified in some institutions.
7.4 Assessors for deaf people should be appropriately trained and accredited, co-ordinated centrally by a Centre for Linguistic Access. This training/accreditation process would be progressive and ongoing. The Centre would also provide a focal networking point for all stakeholders with an interest in assessment of deaf students, including deaf students themselves.
7.5 In the meantime, the ‘Working Towards Best Practice in Linguistic Access for d/Deaf Students’ group should be consulted about developing the supplementary indicators and exemplars, in co-operation with NATED and BRITE. It is understood that the group is interested in expanding NATED membership in Scotland.
7.6 Students should never have to pay for assessments themselves. The Scottish Executive should consider establishing a system for bearing costs where they are not already met, so that students are not financially deterred from applying.