Wales Council of the Blind



Second Annual V.I. Benchmarking Conference 2008.

Survey of achievement of progress on the Benchmarking standards

Contents:

  1. Background.
  2. The response.
  3. The data with Comments and Recommendations.
  4. Discussion.

1. Background

The original proposal to apply the benchmarking method to services to vision impaired people was advocated initially by WCB and supported by ADSS Cymru, by CSSIW, by SSIA and by all 22 Social Services departments. A considerable amount of work by service providers, both managers and frontline workers, from both statutory and voluntary agencies, resulted in the launch of the Benchmarking Report and the Good Practice Guidelines in April 2006.

This was followed by regional workshops to help social services utilise the Guidelines to improve their services. A Benchmarking Implementation Group was then set up to monitor how the Benchmarking process was helping local Social Services plan improvements.

At the Benchmarking conference in July 2007, a Performance Monitoring Process developed by Wrecsam Social Services was presented and circulated. This work was welcomed by the conference and it was agreed that this Process would be a useful tool in carrying out a survey to measure progress so far and to up-date some of the data first gathered for the original Report. In planning the second Benchmarking conference, the Implementation Group felt that it would be useful to carry out an up-dating survey and decided to use the Wrecsam document as a basis for the questionnaire.

2. The response

The questionnaire was circulated to those who had been identified as the official V. I. Benchmarking contacts in all Social Services departments. It was made clear that the replies would be treated as confidential and that the summary report would not name authorities except where there were examples of good practice. The closing date was Tuesday, 13th May, and reminders were sent out three times offering an extended deadline. Phone calls were made to those who had not submitted by the extended deadline. Of the twenty-two authorities, fifteen submitted responses.

Where a question was not replied to and no alternative was offered in a text reply, the response has been counted as a negative.

It is not possible to know what is happening in the authorities that did not respond. Whether this indicates a lack of interest in benchmarking the v.i. service or has some other cause, it cannot be satisfactory to leave gaps in the cross-Wales data and, possibly, a lack of progress in improving standards.

3. The data with Comments and Recommendations

Below is a summary of the responses to the questionnaire. This is the basic quantifiable data. There were in addition many text comments added to the replies and Question 16 produced a mass of informative comment.

I have analysed the responses according to local authority and according to question and offer comments and recommendations under each heading.

1. Are you using the Benchmarking Performance Monitoring Process to measure your performance in your services to sight impaired people in your area?

Yes 8, No 7.

As this process was intended to assist the implementation of the good practice standards, it is important to distinguish between the use of the Monitoring Process and the use of the Benchmarking standards themselves. Although eight authorities are using this Monitoring Process, eleven appear to be using the Good Practice Guidelines while relying on other methods to monitor performance.

2. What method are you using to measure your performance in services to sight impaired people in your area?

Of the seven who had not opted to use this Monitoring Process, one was about to re-configure services and would thereafter use this Monitoring Process.
Two had developed or were developing their own methods of monitoring.
One was using the authority's Corporate Business Planning Toolkit.
One was using a mixture of reports, supervision and feedback from service users.
Two authorities had no specific method of measuring performance.

Comments: If all authorities used the same method, comparison of performance across Wales would be easier. As things stand, the other methods in use invite examination of their suitability for v.i. services.

Recommendation: Examine the efficacy of the other methods used for monitoring v.i. services.

3. Have you set up a VI Joint Working Group to review & monitor Good Practice Guides?

Yes 6, No 9.

Only two of the nine authorities who had not set up Joint Working Groups explained their alternative methods, and these consisted mainly of consultation with the voluntary organisation and service users.

Comments: As consultation tends to focus on outcomes, offering it as an alternative seems to miss the point of this exercise, which is about the organisational effort required to raise standards.
It is also worth pointing out that feedback from service users is still an under-developed practice and, if the aim is to promote independence and self-confidence rather than customer satisfaction, it may not give an accurate picture.

Recommendation: Examine the efficacy of the other methods used for monitoring v.i. services.

4. Does the Group include a Commissioning or Performance Officer with a strategic perspective?

Yes 6, No 9.

Again only two of the nine authorities who did not have such an Officer gave any explanation. They mentioned consultation with service users and Customer Liaison and Complaints and Compliments Officers.

Comments: These alternatives do of course contribute to monitoring but, having observed authorities both with and without Commissioning Officers, I have no doubt that this is a critical post, and it is not clear to me that the function and effect of such a post have been taken on board.

Recommendation: To investigate who carries out this function in all authorities and to understand their role in improving the capacity of the service.

5. Have terms of reference, admin arrangements & programme of review meetings been agreed for this Group?

Yes 6, No 9.

The one text reply from an authority without a Joint Working Group explained that there were reports to Sensory and Physical Disability team meetings.

Comments: Given that nine authorities have no Joint Working Groups, one has to wonder about their capacity to pursue a focused and measurable programme of improvement.

6. Has consideration been given to how service users will be involved in the process and has support been arranged to facilitate their full engagement?

Yes 9, No 6.

The positive responses gave varied accounts of how this is being achieved. Only one of the authorities whose answer was negative provided a text response.
One authority mentioned having difficulties in engaging service users in the specific business of monitoring the Good Practice Guidelines.

Comments: Nine positive replies to this question are encouraging. But there must still be concern about the others.

The comment on these difficulties raises the major issue of best practice in consulting service users, and the distinct business of involving them. This must also include the matter of the funding required.

Recommendations: Develop best practice guidelines on consultation with service users. Develop best practice guidelines on involving service users.

7. Have quality assurance systems been put in place to evidence achievement of standards, both quantitative (information systems) and qualitative (service user satisfaction)?

Yes 3, No 12.

Of the twelve negative responses, one text reply tells us that the authority is in the process of planning development in this area.

Comments: The response to this question is disturbing and clearly requires some movement on the part of the authorities.

Recommendation: Inquire into whether this could be improved in any way by a strategic and co-operative approach.

8. Have you carried out an initial review of the Good Practice Guide Standards and evaluated the status of current performance as Red (Standard not fully achieved), Blue (Standard fully achieved) or Pink (Standard on hold)?

Information provision:

Yes 11, No 4.

Referral and Assessment:

Yes 10, No 5.

Rehabilitation:

Yes 9, No 6.

Children and Young People:

Yes 6, No 9.

The eleven positive responses indicate the higher take-up of the Good Practice Guidelines as compared to the Monitoring tool.

The text responses indicated considerable progress on some of the standards.

Regarding Rehabilitation, one authority said that their Rehabilitation post was vacant and had been so for some time. Another authority said that any identified need that was unmet in-house would be commissioned.

The least used to date is the Children & Young People set. One of the negative responses said nevertheless that they are in the process of developing a plan to address this.

Comments: It is possible that the decrease in take-up from the first set to the fourth is because these GPGs are being tackled in sequence. However, Rehabilitation is a key service. Furthermore, the gaps in services for v.i. Children & Young People revealed by the WCB programme of surveys signal the need for work in this area.

Recommendations: To carry out a detailed survey of the implementation of the GPGs for Rehabilitation. To carry out a detailed survey of the implementation of the GPGs for Children & Young People.

9. Have you transferred those standards evaluated as Red in the Minimum Standards column to an Action Plan Template?

Yes 5, No 10.

One authority's text response said that their Action Plan forms part of the Physical and Sensory Disabilities Business Plan.

One of the negative responses came from an authority whose text reply says that they are currently undertaking this task.

Comments: These questions - 9 to 14 - all apply to the implementation of the Monitoring tool and so are less relevant to those who have opted for other methods of measuring performance.

10. Have actions, timescales and lead officers been agreed to change the status from Red to Yellow (Agreed actions in place but not yet started)?

Yes 3, No 12.

The text responses were as for Q. 9.

11. Have you transferred those standards evaluated as Red in the Good Practice Standards column to an Action Plan Template?

Yes 4, No 11.

The text responses were as for Q. 9.

12. Have actions, timescales and lead officers been agreed to change the status from Red to Yellow (Agreed actions in place but not yet started)?

Yes 4, No 11.

The text responses were as for Q. 9.

13. Have you transferred those standards evaluated as Red in the Better Practice Standards column to an Action Plan Template?

Yes 3, No 12.

The text responses were as for Q. 9.

14. Have actions, timescales and lead officers been agreed to change the status from Red to Yellow (Agreed actions in place but not yet started)?

Yes 3, No 12.

The text responses were as for Q. 9.

One of the negative responses came from an authority whose text reply said that they are currently awaiting the appointment of a Lead Officer.

15. Has an on-going improvement & review programme for all standards in the Good Practice Guides been agreed?

Yes 4, No 11.

16. Any other comments you would like to make about how best to improve Social Services and measure the improvement for sight impaired people?

The replies here were varied and extremely helpful in forming a picture of what is happening in qualitative terms.

The standards in the GPGs are seen as a useful model for local authorities. However their implementation clearly differs according to local circumstances, for example by their inclusion in local Service/Business Planning systems.

Some practical problems have emerged, such as staff capacity and re-organisation, which have held up some authorities; one authority found the Performance Monitoring system onerous.

It was suggested in one response that mandatory KPIs would be helpful.

More resources from WAG were recommended to ensure that data is captured accurately.

It was pointed out that the changed v.i. pathway is resulting in an increasing number of people being referred without certification.

The strength of all agencies recognising the importance of the standards and working together to achieve them was stressed.

The Assembly's Data Unit collects information on Deaf/HoH equipment and one constructive suggestion was that it should do the same for v.i. equipment.

Comments: The responses to this survey give sufficient evidence that the all-Wales approach has helped drive improvement and, most importantly, the survey has identified gaps in service planning.

Secondly, whilst it can be allowed that re-organisation of a service can cause problems, it can also promise a better service; staff capacity, however, is a problem to be solved, not a reason for unsatisfactory performance.

Recommendations: A review should take place of the impact of the changes in the v.i. pathway on the capacity and funding of Social Services.

The Assembly's Data Unit should be pressed to collect information on v.i. equipment as it already does for Deaf/HoH equipment.

The Benchmarking Network should be maintained and its administrative and research capacity appropriately funded.

4. Discussion

In the light of the results of this survey, I believe that the following questions need to be debated:

(a) Do you agree with the recommendations above?
(b) Do you consider that the VI Benchmarking standards should continue to be promoted?
(c) Do you consider that the Benchmarking Performance Monitoring Process should continue to be promoted to measure performance?
(d) Does the all-Wales aspect of this work help local authorities to improve services to vision impaired people in their area?
(e) Does the work of serving vision impaired people receive equivalent attention and support at strategic, corporate, financial and management levels to that of serving other sectors? If not, what can be done to correct that situation?
(f) What further actions would you recommend for the Benchmarking Network?

Resources.