Process Theory


The experts below are selected from a list of 324 experts worldwide, ranked by the ideXlab platform.

Anne Macfarlane - One of the best experts on this subject based on the ideXlab platform.

  • Assessing the facilitators and barriers of interdisciplinary team working in primary care using Normalisation Process Theory: an integrative review
    PLOS ONE, 2017
    Co-Authors: Pauline O'Reilly, Madeleine O'Sullivan, Walter Cullen, Catriona Kennedy, Anne Macfarlane
    Abstract:

    Background: Interdisciplinary team working is of paramount importance in the reform of primary care in order to provide cost-effective and comprehensive care. However, international research shows that it is not routine practice in many healthcare jurisdictions. It is imperative to understand levers and barriers to the implementation process. This review examines interdisciplinary team working in practice, in primary care, from the perspective of service providers and analyses (1) barriers and facilitators to implementation of interdisciplinary teams in primary care and (2) the main research gaps. Methods and findings: An integrative review following the PRISMA guidelines was conducted. Following a search of 10 international databases, 8,827 titles were screened for relevance and 49 met the criteria. Quality of evidence was appraised using predetermined criteria. Data were analysed following the principles of framework analysis using Normalisation Process Theory (NPT), which has four constructs: sense making, enrolment, enactment, and appraisal. The literature is dominated by a focus on interdisciplinary working between physicians and nurses. There is a dearth of evidence about all NPT constructs apart from enactment. Physicians play a key role in encouraging the enrolment of others in primary care team working and in enabling effective divisions of labour in the team. The experience of interdisciplinary working emerged as a lever for its implementation, particularly where communication and respect were strong between professionals. Conclusion: A key lever for interdisciplinary team working in primary care is to get professionals working together and to learn from each other in practice. However, the evidence base is limited as it does not reflect the experiences of all primary care professionals and it is primarily about the enactment of team working. We need to know much more about the experiences of the full network of primary care professionals regarding all aspects of implementation work. Systematic review registration: International Prospective Register of Systematic Reviews (PROSPERO) 2015: CRD42015019362.

  • A qualitative systematic review of studies using the Normalization Process Theory to research implementation processes
    Implementation Science, 2014
    Co-Authors: Rachel McEvoy, Frances S Mair, Luciana Ballini, Susanna Maltoni, Catherine A O'Donnell, Anne Macfarlane
    Abstract:

    Background: There is a well-recognized need for greater use of theory to address research translational gaps. Normalization Process Theory (NPT) provides a set of sociological tools to understand and explain the social processes through which new or modified practices of thinking, enacting, and organizing work are implemented, embedded, and integrated in healthcare and other organizational settings. This review of NPT offers readers the opportunity to observe how, and in what areas, a particular theoretical approach to implementation is being used. In this article we review the literature on NPT in order to understand what interventions NPT is being used to analyze, how NPT is being operationalized, and the reported benefits, if any, of using NPT.

  • Evaluating complex interventions and health technologies using Normalization Process Theory: development of a simplified approach and web-enabled toolkit
    BMC Health Services Research, 2011
    Co-Authors: Tracy Finch, Frances S Mair, Anne Macfarlane, Shaun Treweek, Luciana Ballini, Elizabeth Murray, Tim Rapley
    Abstract:

    Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users' manual. Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. On-line data collection was effective: over a four-week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationProcess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Normalization Process Theory has been developed through transparent procedures at each stage of its life. The theory has been shown to be sufficiently robust to merit formal testing. This project has provided a user-friendly version of NPT that can be embedded in a web-enabled toolkit and used as a heuristic device to think through implementation and integration problems.

  • Normalisation Process Theory: a framework for developing, evaluating and implementing complex interventions
    BMC Medicine, 2010
    Co-Authors: Elizabeth Murray, Frances S Mair, Tracy Finch, Anne Macfarlane, Christopher Dowrick, Shaun Treweek, Luciana Ballini, Catherine Pope, Anne Kennedy, Catherine A O'Donnell
    Abstract:

    Background: The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion: In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary: The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  • Development of a theory of implementation and integration: Normalization Process Theory
    Implementation Science, 2009
    Co-Authors: Frances S Mair, Tracy Finch, Anne Macfarlane, Christopher Dowrick, Shaun Treweek, Tim Rapley, Luciana Ballini, Anne Rogers, Elizabeth Murray, Glyn Elwyn
    Abstract:

    Theories are important tools in the social and natural sciences. The methods by which they are derived are rarely described and discussed. Normalization Process Theory explains how new technologies, ways of acting, and ways of working become routinely embedded in everyday practice, and has applications in the study of implementation processes. This paper describes the process by which it was built. Between 1998 and 2008, we developed a theory. We derived a set of empirical generalizations from analysis of data collected in qualitative studies of healthcare work and organization. We developed an applied theoretical model through analysis of empirical generalizations. Finally, we built a formal theory through a process of extension and implication analysis of the applied theoretical model. Each phase of theory development showed that the constructs of the theory did not conflict with each other, had explanatory power, and possessed sufficient robustness for formal testing. As the theory developed, its scope expanded from a set of observed regularities in data with procedural explanations, to an applied theoretical model, to a formal middle-range theory. Normalization Process Theory has been developed through procedures that were properly sceptical and critical, and which were open to review at each stage of development. The theory has been shown to merit formal testing.

Frances S Mair - One of the best experts on this subject based on the ideXlab platform.

  • Improving the normalization of complex interventions: part 2 - validation of the NoMAD instrument for assessing implementation work based on Normalization Process Theory (NPT)
    BMC Medical Research Methodology, 2018
    Co-Authors: Tracy Finch, Frances S Mair, Shaun Treweek, Elizabeth Murray, Melissa Girling, Elaine McColl, I N Steen, Clare Cook, C R Vernazza, Nicola Mackintosh
    Abstract:

    Successful implementation and embedding of new health care practices relies on co-ordinated, collective behaviour of individuals working within the constraints of health care settings. Normalization Process Theory (NPT) provides a theory of implementation that emphasises collective action in explaining, and shaping, the embedding of new practices. To extend the practical utility of NPT for improving implementation success, an instrument (NoMAD) was developed and validated. Descriptive analysis and psychometric testing of an instrument developed by the authors, through an iterative process that included item generation, consensus methods, item appraisal, and cognitive testing. A 46-item questionnaire was tested in 6 sites implementing health-related interventions, using paper and online completion. Participants were staff directly involved in working with the interventions. Descriptive analysis and consensus methods were used to remove redundancy, reducing the final tool to 23 items. Data were subject to confirmatory factor analysis which sought to confirm the theoretical structure within the sample. We obtained 831 completed questionnaires, an average response rate of 39% (range: 22–77%). Full completion of items was 50% (n = 413). The confirmatory factor analysis showed the model achieved acceptable fit (CFI = 0.95, TLI = 0.93, RMSEA = 0.08, SRMR = 0.03). Construct validity of the four theoretical constructs of NPT was supported, and internal consistency (Cronbach's alpha) was as follows: Coherence (4 items, α = 0.71); Collective Action (7 items, α = 0.78); Cognitive Participation (4 items, α = 0.81); Reflexive Monitoring (5 items, α = 0.65). The normalisation scale overall was highly reliable (20 items, α = 0.89). The NoMAD instrument has good face validity, construct validity and internal consistency for assessing staff perceptions of factors relevant to embedding interventions that change their work practices. Uses in evaluating and guiding implementation are proposed.
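
    The Cronbach's alpha values above summarise the internal consistency of each NPT construct scale. As a rough illustration of how such a coefficient is computed from item-level data, here is a minimal Python sketch; the helper function and the random response matrix are hypothetical and do not reproduce the NoMAD data or results.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of item scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                              # number of items in the scale
    item_variances = scores.var(axis=0, ddof=1)      # sample variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Hypothetical data: 831 respondents answering a 4-item construct (e.g. Coherence)
# on a 1-5 scale. Answers are random here, so alpha will be near zero,
# unlike the real, internally consistent scale reported in the abstract.
rng = np.random.default_rng(seed=42)
responses = rng.integers(low=1, high=6, size=(831, 4))
print(f"alpha = {cronbach_alpha(responses):.2f}")
```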

  • Using Normalization Process Theory in feasibility studies and process evaluations of complex healthcare interventions: a systematic review
    Implementation Science, 2018
    Co-Authors: Amanda Cummings, Frances S Mair, Tim Rapley, Elizabeth Murray, Melissa Girling, Michael Bracher, Michelle Myall, Tracy Finch
    Abstract:

    Normalization Process Theory (NPT) identifies, characterises and explains key mechanisms that promote and inhibit the implementation, embedding and integration of new health techniques, technologies and other complex interventions. A large body of literature that employs NPT to inform feasibility studies and process evaluations of complex healthcare interventions has now emerged. The aims of this review were to review this literature; to identify and characterise the uses and limits of NPT in research on the implementation and integration of healthcare interventions; and to explore NPT's contribution to understanding the dynamics of these processes. A qualitative systematic review was conducted. We searched Web of Science, Scopus and Google Scholar for articles with empirical data in peer-reviewed journals that cited either key papers presenting and developing NPT, or the NPT Online Toolkit (www.normalizationProcess.org). We included in the review only articles that used NPT as the primary approach to collection, analysis or reporting of data in studies of the implementation of healthcare techniques, technologies or other interventions. A structured data extraction instrument was used, and data were analysed qualitatively. Searches revealed 3322 citations. After eliminating 2337 duplicates and broken or junk URLs, 985 were screened as titles and abstracts. Of these, 101 were excluded because they did not fit the inclusion criteria for the review. This left 884 articles for full-text screening. Of these, 754 did not fit the inclusion criteria for the review. This left 130 papers presenting results from 108 identifiable studies to be included in the review. NPT appears to provide researchers and practitioners with a conceptual vocabulary for rigorous studies of implementation processes. It identifies, characterises and explains empirically identifiable mechanisms that motivate and shape implementation processes. Taken together, these mean that analyses using NPT can effectively assist in the explanation of the success or failure of specific implementation projects. Ten percent of papers included critiques of some aspect of NPT, with those that did mainly focusing on its terminology. However, two studies critiqued NPT's emphasis on agency, and one study critiqued NPT for its normative focus. This review demonstrates that researchers found NPT useful and applied it across a wide range of interventions. It has been effectively used to aid intervention development and implementation planning as well as evaluating and understanding implementation processes themselves. In particular, NPT appears to have offered a valuable set of conceptual tools to aid understanding of implementation as a dynamic process.
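
    The screening figures above form a simple subtraction chain from search results to included papers. The short sketch below, using the counts copied from the abstract, simply replays that arithmetic as a sanity check; the stage labels are paraphrased and are not taken from the paper's own flow diagram.

```python
# Counts copied from the abstract above; stage names are paraphrased.
retrieved = 3322
after_deduplication = retrieved - 2337             # duplicates and broken/junk URLs removed
after_title_abstract = after_deduplication - 101   # excluded at title/abstract screening
included_papers = after_title_abstract - 754       # excluded at full-text screening

# Matches the 985, 884 and 130 reported in the abstract.
assert (after_deduplication, after_title_abstract, included_papers) == (985, 884, 130)
print(after_deduplication, after_title_abstract, included_papers)  # 985 884 130
```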

  • A qualitative systematic review of studies using the Normalization Process Theory to research implementation processes
    Implementation Science, 2014
    Co-Authors: Rachel McEvoy, Frances S Mair, Luciana Ballini, Susanna Maltoni, Catherine A O'Donnell, Anne Macfarlane
    Abstract:

    Background: There is a well-recognized need for greater use of theory to address research translational gaps. Normalization Process Theory (NPT) provides a set of sociological tools to understand and explain the social processes through which new or modified practices of thinking, enacting, and organizing work are implemented, embedded, and integrated in healthcare and other organizational settings. This review of NPT offers readers the opportunity to observe how, and in what areas, a particular theoretical approach to implementation is being used. In this article we review the literature on NPT in order to understand what interventions NPT is being used to analyze, how NPT is being operationalized, and the reported benefits, if any, of using NPT.

  • Improving the normalization of complex interventions: measure development based on Normalization Process Theory (NoMAD) study protocol
    Implementation Science, 2013
    Co-Authors: Tracy Finch, Frances S Mair, Shaun Treweek, Tim Rapley, Elizabeth Murray, Melissa Girling, Elaine McColl, I N Steen
    Abstract:

    Background: Understanding implementation processes is key to ensuring that complex interventions in healthcare are taken up in practice and thus maximize intended benefits for service provision and (ultimately) care to patients. Normalization Process Theory (NPT) provides a framework for understanding how a new intervention becomes part of normal practice. This study aims to develop and validate simple generic tools derived from NPT, to be used to improve the implementation of complex healthcare interventions. Objectives: The objectives of this study are to: develop a set of NPT-based measures and formatively evaluate their use for identifying implementation problems and monitoring progress; conduct preliminary evaluation of these measures across a range of interventions and contexts, and identify factors that affect this process; explore the utility of these measures for predicting outcomes; and develop an online users' manual for the measures. Methods: A combination of qualitative (workshops, item development, user feedback, cognitive interviews) and quantitative (survey) methods will be used to develop NPT measures, and test the utility of the measures in six healthcare intervention settings. Discussion: The measures developed in the study will be available for use by those involved in planning, implementing, and evaluating complex interventions in healthcare and have the potential to enhance the chances of their implementation, leading to sustained changes in working practices.

  • Evaluating complex interventions and health technologies using Normalization Process Theory: development of a simplified approach and web-enabled toolkit
    BMC Health Services Research, 2011
    Co-Authors: Tracy Finch, Frances S Mair, Anne Macfarlane, Shaun Treweek, Luciana Ballini, Elizabeth Murray, Tim Rapley
    Abstract:

    Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users' manual. Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. On-line data collection was effective: over a four-week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationProcess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Normalization Process Theory has been developed through transparent procedures at each stage of its life. The theory has been shown to be sufficiently robust to merit formal testing. This project has provided a user-friendly version of NPT that can be embedded in a web-enabled toolkit and used as a heuristic device to think through implementation and integration problems.

Paul Ralph - One of the best experts on this subject based on the ideXlab platform.

  • A teleological process theory of software development
    2020
    Co-Authors: Paul Ralph, Yair Wand
    Abstract:

    This paper presents a teleological process theory of software design in organizations. The proposed theory is compared to the Function-Behavior-Structure (FBS) Framework – a leading process theory of engineering design proposed by John Gero. A positivist, multiple case study methodology to empirically compare the veracity and predictive power of the two theories is described. Results from a pilot case suggest that the observed behaviors of the development team are better described by the proposed theory than by the FBS Framework.

  • Software engineering process theory
    Information & Software Technology, 2016
    Co-Authors: Paul Ralph
    Abstract:

    Context: Software engineering has experienced increased calls for attention to theory, including process theory and general theory. However, few process theories or potential general theories have been proposed and little empirical evaluation has been attempted. Objective: The purpose of this paper is to empirically evaluate two previously untested software development process theories - Sensemaking-Coevolution-Implementation Theory (SCI) and the Function-Behavior-Structure Framework (FBS). Method: A survey of more than 1300 software developers is combined with four longitudinal, positivist case studies to achieve a simultaneously broad and deep empirical evaluation. Instrument development, statistical analysis of questionnaire data, case data analysis using a closed-ended, a priori coding scheme and data triangulation are described. Results: Case data analysis strongly supports SCI, as does analysis of questionnaire response distributions (p < 0.001; chi-square goodness-of-fit test).

  • Software engineering process theory: a multi-method comparison of Sensemaking-Coevolution-Implementation Theory and Function-Behavior-Structure Theory
    Information and Software Technology, 2016
    Co-Authors: Paul Ralph
    Abstract:

    Context: Software engineering has experienced increased calls for attention to theory, including process theory and general theory. However, few process theories or potential general theories have been proposed and little empirical evaluation has been attempted. Objective: The purpose of this paper is to empirically evaluate two previously untested software development process theories - Sensemaking-Coevolution-Implementation Theory (SCI) and the Function-Behavior-Structure Framework (FBS). Method: A survey of more than 1300 software developers is combined with four longitudinal, positivist case studies to achieve a simultaneously broad and deep empirical evaluation. Instrument development, statistical analysis of questionnaire data, case data analysis using a closed-ended, a priori coding scheme and data triangulation are described. Results: Case data analysis strongly supports SCI, as does analysis of questionnaire response distributions (p < 0.001; chi-square goodness-of-fit test). Furthermore, case-questionnaire triangulation found no evidence that support for SCI varied by participants' gender, education, experience, nationality or the size or nature of their projects. Conclusions: SCI is supported. No evidence of an FBS subculture was found. This suggests that instead of iterating between weakly-coupled phases (analysis, design, coding, testing), it is more accurate and useful to conceptualize development as ad hoc oscillation between making sense of the project context (Sensemaking), simultaneously improving mental representations of the context and design space (Coevolution) and constructing, debugging and deploying software artifacts (Implementation).
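
    The Results mention a chi-square goodness-of-fit test on questionnaire response distributions. The fragment below is only a hypothetical illustration of that kind of test: the response categories and counts are invented, since the paper's actual data are not reproduced here.

```python
from scipy.stats import chisquare

# Invented counts for illustration only: suppose ~1300 survey respondents each
# selected which theory best matched their observed development process, and the
# null hypothesis is a uniform split across the three options.
observed = [820, 310, 170]                 # e.g. SCI-like, FBS-like, neither
expected = [sum(observed) / 3] * 3         # uniform null distribution

statistic, p_value = chisquare(f_obs=observed, f_exp=expected)
print(f"chi2 = {statistic:.1f}, p = {p_value:.3g}")  # a tiny p rejects the uniform null
```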

  • AGILE - Explaining Agility with a Process Theory of Change
    2015 Agile Conference, 2015
    Co-Authors: Michael Wufka, Paul Ralph
    Abstract:

    While agile approaches have been widely adopted, our theoretical understanding of their foundations and impacts remains limited. This is due to conflating two entirely different meanings of "agile." We therefore unpack these two meanings and present our tentative understanding as a process theory. The theory posits that agility emerges from a dialectic interplay between recognizing and responding to needs for change. Meanwhile, rather than directly affecting success, agility moderates the negative effects of need for change on success. Viewing agility this way helps address the research-practice gap by highlighting the need for skepticism of methods and practices, and by suggesting practically relevant research questions.

Tracy Finch - One of the best experts on this subject based on the ideXlab platform.

  • The Swedish version of the Normalization Process Theory Measure (S-NoMAD): translation, adaptation and pilot testing
    Implementation Science, 2018
    Co-Authors: Sofi Nordmark, Tracy Finch, Johan Lyhagen, Inger Lindberg, Anna Cristina Aberg
    Abstract:

    The original British instrument, the Normalization Process Theory Measure (NoMAD), is based on the four core constructs of the Normalization Process Theory: Coherence, Cognitive Participation, Collective Action, and Reflexive Monitoring. They represent ways of thinking about implementation and are focused on how interventions can become part of everyday practice. The aims were to translate and adapt the original NoMAD into the Swedish version S-NoMAD and to evaluate its psychometric properties based on a pilot test in a health care context including in-hospital, primary, and community care contexts. A systematic approach with a four-step process was utilized, including forward and backward translation and expert reviews for the test and improvement of content validity of the S-NoMAD in different stages of development. The final S-NoMAD version was then used for process evaluation in a pilot study aimed at the implementation of a new working method for individualized care planning. The pilot was executed in two hospitals, four health care centres, and two municipalities in a region in northern Sweden. The S-NoMAD pilot results were analysed for validity using confirmatory factor analysis, i.e. a one-factor model was fitted for each of the four constructs of the S-NoMAD. Cronbach's alpha was used to ascertain the internal consistency reliability. In the pilot, S-NoMAD data were collected from 144 individuals who were different health care professionals or managers. The initial factor analysis model showed good fit for two of the constructs (Coherence and Cognitive Participation) and unsatisfactory fit for the remaining two (Collective Action and Reflexive Monitoring) based on three items. Deleting those items from the model yielded a good fit and good internal consistency (alphas between 0.78 and 0.83). However, the estimation of correlations between the factors showed that the factor Reflexive Monitoring was highly correlated (around 0.9) with the factors Coherence and Collective Action. The results show initial satisfactory psychometric properties for the translation and first validation of the S-NoMAD. However, development of a highly valid and reliable instrument is an iterative process, requiring more extensive validation in various settings and populations. Thus, in order to establish the validity and reliability of the S-NoMAD, additional psychometric testing is needed.

  • Improving the normalization of complex interventions: part 2 - validation of the NoMAD instrument for assessing implementation work based on Normalization Process Theory (NPT)
    BMC Medical Research Methodology, 2018
    Co-Authors: Tracy Finch, Frances S Mair, Shaun Treweek, Elizabeth Murray, Melissa Girling, Elaine McColl, I N Steen, Clare Cook, C R Vernazza, Nicola Mackintosh
    Abstract:

    Successful implementation and embedding of new health care practices relies on co-ordinated, collective behaviour of individuals working within the constraints of health care settings. Normalization Process Theory (NPT) provides a theory of implementation that emphasises collective action in explaining, and shaping, the embedding of new practices. To extend the practical utility of NPT for improving implementation success, an instrument (NoMAD) was developed and validated. Descriptive analysis and psychometric testing of an instrument developed by the authors, through an iterative process that included item generation, consensus methods, item appraisal, and cognitive testing. A 46-item questionnaire was tested in 6 sites implementing health-related interventions, using paper and online completion. Participants were staff directly involved in working with the interventions. Descriptive analysis and consensus methods were used to remove redundancy, reducing the final tool to 23 items. Data were subject to confirmatory factor analysis which sought to confirm the theoretical structure within the sample. We obtained 831 completed questionnaires, an average response rate of 39% (range: 22–77%). Full completion of items was 50% (n = 413). The confirmatory factor analysis showed the model achieved acceptable fit (CFI = 0.95, TLI = 0.93, RMSEA = 0.08, SRMR = 0.03). Construct validity of the four theoretical constructs of NPT was supported, and internal consistency (Cronbach's alpha) was as follows: Coherence (4 items, α = 0.71); Collective Action (7 items, α = 0.78); Cognitive Participation (4 items, α = 0.81); Reflexive Monitoring (5 items, α = 0.65). The normalisation scale overall was highly reliable (20 items, α = 0.89). The NoMAD instrument has good face validity, construct validity and internal consistency for assessing staff perceptions of factors relevant to embedding interventions that change their work practices. Uses in evaluating and guiding implementation are proposed.

  • Using Normalization Process Theory in feasibility studies and process evaluations of complex healthcare interventions: a systematic review
    Implementation Science, 2018
    Co-Authors: Amanda Cummings, Frances S Mair, Tim Rapley, Elizabeth Murray, Melissa Girling, Michael Bracher, Michelle Myall, Tracy Finch
    Abstract:

    Normalization Process Theory (NPT) identifies, characterises and explains key mechanisms that promote and inhibit the implementation, embedding and integration of new health techniques, technologies and other complex interventions. A large body of literature that employs NPT to inform feasibility studies and process evaluations of complex healthcare interventions has now emerged. The aims of this review were to review this literature; to identify and characterise the uses and limits of NPT in research on the implementation and integration of healthcare interventions; and to explore NPT's contribution to understanding the dynamics of these processes. A qualitative systematic review was conducted. We searched Web of Science, Scopus and Google Scholar for articles with empirical data in peer-reviewed journals that cited either key papers presenting and developing NPT, or the NPT Online Toolkit (www.normalizationProcess.org). We included in the review only articles that used NPT as the primary approach to collection, analysis or reporting of data in studies of the implementation of healthcare techniques, technologies or other interventions. A structured data extraction instrument was used, and data were analysed qualitatively. Searches revealed 3322 citations. After eliminating 2337 duplicates and broken or junk URLs, 985 were screened as titles and abstracts. Of these, 101 were excluded because they did not fit the inclusion criteria for the review. This left 884 articles for full-text screening. Of these, 754 did not fit the inclusion criteria for the review. This left 130 papers presenting results from 108 identifiable studies to be included in the review. NPT appears to provide researchers and practitioners with a conceptual vocabulary for rigorous studies of implementation processes. It identifies, characterises and explains empirically identifiable mechanisms that motivate and shape implementation processes. Taken together, these mean that analyses using NPT can effectively assist in the explanation of the success or failure of specific implementation projects. Ten percent of papers included critiques of some aspect of NPT, with those that did mainly focusing on its terminology. However, two studies critiqued NPT's emphasis on agency, and one study critiqued NPT for its normative focus. This review demonstrates that researchers found NPT useful and applied it across a wide range of interventions. It has been effectively used to aid intervention development and implementation planning as well as evaluating and understanding implementation processes themselves. In particular, NPT appears to have offered a valuable set of conceptual tools to aid understanding of implementation as a dynamic process.

  • Improving the normalization of complex interventions: measure development based on Normalization Process Theory (NoMAD) study protocol
    Implementation Science, 2013
    Co-Authors: Tracy Finch, Frances S Mair, Shaun Treweek, Tim Rapley, Elizabeth Murray, Melissa Girling, Elaine McColl, I N Steen
    Abstract:

    Background: Understanding implementation processes is key to ensuring that complex interventions in healthcare are taken up in practice and thus maximize intended benefits for service provision and (ultimately) care to patients. Normalization Process Theory (NPT) provides a framework for understanding how a new intervention becomes part of normal practice. This study aims to develop and validate simple generic tools derived from NPT, to be used to improve the implementation of complex healthcare interventions. Objectives: The objectives of this study are to: develop a set of NPT-based measures and formatively evaluate their use for identifying implementation problems and monitoring progress; conduct preliminary evaluation of these measures across a range of interventions and contexts, and identify factors that affect this process; explore the utility of these measures for predicting outcomes; and develop an online users' manual for the measures. Methods: A combination of qualitative (workshops, item development, user feedback, cognitive interviews) and quantitative (survey) methods will be used to develop NPT measures, and test the utility of the measures in six healthcare intervention settings. Discussion: The measures developed in the study will be available for use by those involved in planning, implementing, and evaluating complex interventions in healthcare and have the potential to enhance the chances of their implementation, leading to sustained changes in working practices.

  • Evaluating complex interventions and health technologies using Normalization Process Theory: development of a simplified approach and web-enabled toolkit
    BMC Health Services Research, 2011
    Co-Authors: Tracy Finch, Frances S Mair, Anne Macfarlane, Shaun Treweek, Luciana Ballini, Elizabeth Murray, Tim Rapley
    Abstract:

    Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users' manual. Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. On-line data collection was effective: over a four-week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationProcess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Normalization Process Theory has been developed through transparent procedures at each stage of its life. The theory has been shown to be sufficiently robust to merit formal testing. This project has provided a user-friendly version of NPT that can be embedded in a web-enabled toolkit and used as a heuristic device to think through implementation and integration problems.

Glyn Elwyn - One of the best experts on this subject based on the ideXlab platform.

  • Patchy 'coherence': using Normalization Process Theory to evaluate a multi-faceted shared decision making implementation program (MAGIC)
    Implementation Science, 2013
    Co-Authors: Amy Lloyd, Glyn Elwyn, Natalie Joseph-Williams, Adrian Edwards
    Abstract:

    Background: Implementing shared decision making into routine practice is proving difficult, despite considerable interest from policy-makers, and is far more complex than merely making decision support interventions available to patients. Few have reported successful implementation beyond research studies. MAking Good Decisions In Collaboration (MAGIC) is a multi-faceted implementation program, commissioned by The Health Foundation (UK), to examine how best to put shared decision making into routine practice. In this paper, we investigate healthcare professionals' perspectives on implementing shared decision making during the MAGIC program, to examine the work required to implement shared decision making and to inform future efforts. Methods: The MAGIC program approached implementation of shared decision making by initiating a range of interventions including: providing workshops; facilitating development of brief decision support tools (Option Grids); initiating a patient activation campaign ('Ask 3 Questions'); gathering feedback using Decision Quality Measures; providing clinical leads meetings, learning events, and feedback sessions; and obtaining executive board level support. At 9 and 15 months (May and November 2011), two rounds of semi-structured interviews were conducted with healthcare professionals in three secondary care teams to explore views on the impact of these interventions. Interview data were coded by two reviewers using a framework derived from the Normalization Process Theory. Results: A total of 54 interviews were completed with 31 healthcare professionals. Partial implementation of shared decision making could be explained using the four components of the Normalization Process Theory: 'coherence,' 'cognitive participation,' 'collective action,' and 'reflexive monitoring.' Shared decision making was integrated into routine practice when clinical teams shared coherent views of role and purpose ('coherence'). Shared decision making was facilitated when teams engaged in developing and delivering interventions ('cognitive participation'), and when those interventions fit with existing skill sets and organizational priorities ('collective action'), resulting in demonstrable improvements to practice ('reflexive monitoring'). The implementation process uncovered diverse and conflicting attitudes toward shared decision making; 'coherence' was often missing. Conclusions: The study showed that implementation of shared decision making is more complex than the delivery of patient decision support interventions to patients, a portrayal that often goes unquestioned. Normalizing shared decision making requires intensive work to ensure teams have a shared understanding of the purpose of involving patients in decisions, and undergo the attitudinal shifts that many health professionals feel are required when comprehension goes beyond initial interpretations. Divergent views on the value of engaging patients in decisions remain a significant barrier to implementation.

  • Development of a theory of implementation and integration: Normalization Process Theory
    Implementation Science, 2009
    Co-Authors: Frances S Mair, Tracy Finch, Anne Macfarlane, Christopher Dowrick, Shaun Treweek, Tim Rapley, Luciana Ballini, Anne Rogers, Elizabeth Murray, Glyn Elwyn
    Abstract:

    Theories are important tools in the social and natural sciences. The methods by which they are derived are rarely described and discussed. Normalization Process Theory explains how new technologies, ways of acting, and ways of working become routinely embedded in everyday practice, and has applications in the study of implementation processes. This paper describes the process by which it was built. Between 1998 and 2008, we developed a theory. We derived a set of empirical generalizations from analysis of data collected in qualitative studies of healthcare work and organization. We developed an applied theoretical model through analysis of empirical generalizations. Finally, we built a formal theory through a process of extension and implication analysis of the applied theoretical model. Each phase of theory development showed that the constructs of the theory did not conflict with each other, had explanatory power, and possessed sufficient robustness for formal testing. As the theory developed, its scope expanded from a set of observed regularities in data with procedural explanations, to an applied theoretical model, to a formal middle-range theory. Normalization Process Theory has been developed through procedures that were properly sceptical and critical, and which were open to review at each stage of development. The theory has been shown to merit formal testing.