Publisher: NIHR Health Technology Assessment Programme
Languages: English
Types: Article
Identifiers: doi:10.3310/hsdr05050
Background: The Health and Social Care Act 2012 (Great Britain. Health and Social Care Act 2012. London: The Stationery Office; 2012) has mandated research use as a core consideration of health service commissioning arrangements. We evaluated whether or not access to a demand-led evidence briefing service improved the use of research evidence by commissioners, compared with less intensive and less targeted alternatives.

Design: Controlled before-and-after study.

Setting: Clinical Commissioning Groups (CCGs) in the north of England.

Main outcome measures: Change at 12 months from baseline in a CCG's ability to acquire, assess, adapt and apply research evidence to support decision-making. Secondary outcomes measured individual clinical leads' and managers' intentions to use research evidence in decision-making.

Methods: Nine CCGs received one of three interventions: (1) access to an evidence briefing service; (2) contact plus an unsolicited push of non-tailored evidence; or (3) an unsolicited push of non-tailored evidence. Data for the primary outcome measure were collected at baseline and 12 months post intervention, using a survey instrument devised to assess an organisation's ability to acquire, assess, adapt and apply research evidence to support decision-making. In addition, documentary and observational evidence of the use of the outputs of the service was sought, and interviews with CCG participants were undertaken.

Results: Most of the requests were conceptual: they were not directly linked to discrete decisions or actions but were intended to provide knowledge about possible options for future actions. Symbolic use to justify existing decisions and actions was less frequent, and included a decision to close a walk-in centre and lending weight to a major initiative, already under way, to promote self-care. The opportunity to impact directly on decision-making processes was limited to work to establish disinvestment policies.
In terms of impact overall, the evidence briefing service was not associated with increases in CCGs' capacity to acquire, assess, adapt and apply research evidence to support decision-making, in individual intentions to use research findings, or in perceptions of CCGs' relationships with researchers. Regardless of the intervention received, at baseline participating CCGs indicated that they felt they were inconsistent in their research-seeking behaviours, and their capacity to acquire research remained so at follow-up. The informal nature of decision-making processes meant that there was little or no traceability of the use of evidence.

Limitations: Low baseline and follow-up response rates (68% and 44%, respectively) and missing data limit the reliability of these findings.

Conclusions: Access to a demand-led evidence briefing service did not improve the uptake and use of research evidence by NHS commissioners compared with less intensive and less targeted alternatives. Commissioners appear to be well intentioned but ad hoc users of research.

Future work: Further research is required on the effects of interventions and strategies to build individual and organisational capacity to use research. Resource-intensive approaches to providing evidence may be best employed to support instrumental decision-making. Comparative evaluation of the impact of less intensive but targeted strategies on the uptake and use of research by commissioners is warranted.

Funding: The National Institute for Health Research Health Services and Delivery Research programme.

Published in: Health Services and Delivery Research 2017, Vol. 5, No. 5. DOI: 10.3310/hsdr05050. © Queen's Printer and Controller of HMSO 2017. This work was produced by Wilson et al. under the terms of a commissioning contract issued by the Secretary of State for Health. This issue may be freely reproduced for the purposes of private research and study, and extracts (or indeed the full report) may be included in professional journals, provided that suitable acknowledgement is made and the reproduction is not associated with any form of advertising. Applications for commercial reproduction should be addressed to: NIHR Journals Library, National Institute for Health Research, Evaluation, Trials and Studies Coordinating Centre, Alpha House, University of Southampton Science Park, Southampton SO16 7NS, UK.