Meta-analyses in management: What can we learn from clinical research?

  1. Antonio Sartal, University of Vigo, Vigo, Spain
  2. Miguel González-Loureiro, University of Vigo, Vigo, Spain
  3. Xosé H. Vázquez, University of Vigo, Vigo, Spain
Journal: Business Research Quarterly

ISSN: 2340-9444, 2340-9436

Year of publication: 2021

Volume: 24

Issue: 1

Pages: 91-111

Type: Article

DOI: 10.1177/2340944420916310 (open access, publisher version)


Abstract

We analyze the weaknesses of meta-analyses (MAs) in management research, using as a benchmark a scientific field where this technique has a longer tradition: clinical research. We suggest four areas in which management-research MA practice should improve: (1) availability of information and replicability of primary research, (2) correct application of statistical support, (3) execution of heterogeneity analyses, and (4) standardization of result reporting. Using a representative MA on an operations management topic, we qualitatively identify the aspects to be improved at each stage and show the different results that could have been achieved by following standard clinical-research procedures, incorporating several "good practices" from that field. Overall, these recommendations aim to improve the transparency and replicability of MAs, which can not only facilitate the accumulation of scientific knowledge but also intensify the dialogue between academia and practitioners.
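Two of the areas highlighted above, statistical support and heterogeneity analysis, can be illustrated with a short example. The sketch below is not the procedure used in the article; it is a minimal, hypothetical illustration in R using the metafor package cited in the reference list (Viechtbauer, 2010), with made-up correlations and sample sizes, of a random-effects model that reports the heterogeneity statistics (Q, τ², I²) such analyses typically rely on.

```r
## Minimal sketch (hypothetical data): random-effects meta-analysis of
## correlations with the metafor R package (Viechtbauer, 2010).
library(metafor)

# Made-up primary-study correlations (ri) and sample sizes (ni)
dat <- data.frame(
  study = c("A", "B", "C", "D", "E"),
  ri    = c(0.32, 0.18, 0.45, 0.27, 0.05),
  ni    = c(120, 85, 240, 60, 150)
)

# Fisher z transformation and sampling variances
dat <- escalc(measure = "ZCOR", ri = ri, ni = ni, data = dat)

# Random-effects model; REML estimates the between-study variance
res <- rma(yi, vi, data = dat, method = "REML")
summary(res)   # pooled effect, Q test, tau^2, I^2
confint(res)   # confidence intervals for tau^2, I^2, H^2

# Back-transform the pooled Fisher z to a correlation
predict(res, transf = transf.ztor)

# Standard graphical reporting
forest(res)
funnel(res)
```

Setting method = "DL" instead of "REML" would give the DerSimonian-Laird estimator common in clinical trials (DerSimonian & Laird, 1986); Veroniki et al. (2016), also cited below, compare the available between-study variance estimators.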

Funding information

Our work has received financial support from the Spanish and Galician governments through grants ECO2016-76625-R and ED431C 2018/46, respectively. Miguel González-Loureiro acknowledges the support of Portuguese national funds from FCT – Fundação para a Ciência e Tecnologia through project UIDB/04728/2020.


References

  • Abreu-Ledón, R., Luján-García, D. E., Garrido-Vega, P., EscobarPérez, B. (2018). A meta-analytic study of the impact of Lean Production on business performance. International Journal of Production Economics, 200, 83–102.
  • Aguinis, H., Beaty, J. C., Boik, R. J., Pierce, C. A. (2005). Effect size and power in assessing moderating effects of categorical variables using multiple regression: A 30-year review. Journal of Applied Psychology, 90, 94–107.
  • Aguinis, H., Sturman, M. C., Pierce, C. A. (2008). Comparison of three meta-analytic procedures for estimating moderating effects of categorical variables. Organizational Research Methods, 11, 9–34.
  • Aguinis, H., Pierce, C. A., Culpepper, S. A. (2009). Scale coarseness as a methodological artifact: Correcting correlation coefficients attenuated from using coarse scales. Organizational Research Methods, 12, 623–652.
  • Aguinis, H., Dalton, D. R., Bosco, F. A., Pierce, C. A., Dalton, C. M. (2011a). Meta-analytic choices and judgment calls: Implications for theory building and testing, obtained effect sizes, and scholarly impact. Journal of Management, 37, 5–38.
  • Aguinis, H., Gottfredson, R. K., Wright, T. A. (2011b). Best-practice recommendations for estimating interaction effects using meta-analysis. Journal of Organizational Behavior, 32, 1033–1043.
  • Aguinis, H., Pierce, C. A., Bosco, F. A., Dalton, D. R., Dalton, C. M. (2011c). Debunking myths and urban legends about meta-analysis. Organizational Research Methods, 14, 306–331.
  • Anzures-Cabrera, J., Higgins, J. P. (2010). Graphical displays for meta-analysis: An overview with suggestions for practice. Research Synthesis Methods, 1, 66–80.
  • *Avittathur, B., Swamidass, P. (2007). Matching plant flexibility and supplier flexibility: Lessons from small suppliers of U.S. manufacturing plants in India. Journal of Operations Management, 25, 717–735.
  • Aytug, Z. G., Rothstein, H. R., Zhou, W., Kern, M. C. (2012). Revealed or concealed? Transparency of procedures, decisions, and judgment calls in meta-analyses. Organizational Research Methods, 15, 103–133.
  • Borenstein, M., Hedges, L. V., Higgins, J. P. T., Rothstein, H. R. (2009). Introduction to meta-analysis. John Wiley.
  • Bosco, F. A., Uggerslev, K. L., Steel, P. (2017). metaBUS as a vehicle for facilitating meta-analysis. Human Resource Management Review, 27, 237–254.
  • *Callen, J. L., Fader, C., Krinsky, I. (2000). Just-in-time: A cross-sectional plant analysis. International Journal of Production Economics, 63, 277–301.
  • Cao, Z., Lumineau, F. (2015). Revisiting the interplay between contractual and relational governance: A qualitative and meta-analytic investigation. Journal of Operations Management, 33, 15–42.
  • Carlson, K. D., Ji, F. X. (2011). Citing and building on meta-analytic findings: A review and recommendations. Organizational Research Methods, 14, 696–717.
  • *Challis, D., Samson, D., Lawson, B. (2005). Impact of technological, organizational and human resource investments on employee and manufacturing performance: Australian and New Zealand evidence. International Journal of Production Research, 43, 81–107.
  • Chen, J., Damanpour, F., Reilly, R. R. (2010). Understanding antecedents of new product development speed: A meta-analysis. Journal of Operations Management, 28, 17–33.
  • Chowdhry, A. K., Dworkin, R. H., McDermott, M. P. (2016). Meta-analysis with missing study-level sample variance data. Statistics in Medicine, 35, 3021–3032.
  • *Claycomb, C., Droge, C., Germain, R. (1999). The effect of just-in-time with customers on organizational design and performance. International Journal of Logistics Management, 10, 37–58.
  • Cooper, H., Hedges, L. V. (1994a). The handbook of research synthesis and meta-analysis. Russell Sage Foundation.
  • Cooper, H., Hedges, L. V. (1994b). Potentials and limitations of research synthesis. In H. Cooper, L. V. Hedges (Eds.), The handbook of research synthesis and meta-analysis (pp. 521–530). Russell Sage Foundation.
  • Cooper, H., Hedges, L. V., Valentine, J. C. (2009). The handbook of research synthesis and meta-analysis (2nd ed.). Russell Sage Foundation.
  • Cooper, H. M. (1990). On the social psychology of using research reviews. In Wachter K. W., Straf M. L. (Eds.), The future of meta-analysis (pp. 75–88). Russell Sage Foundation.
  • Cortina, J. M. (2003). Apples and oranges (and pears, oh my!): The search for moderators in meta-analysis. Organizational Research Methods, 6, 415–439.
  • Croucher, R. (2019). Research methods and management. British Journal of Management. https://doi.org/10.1111/1467-8551.12347
  • Dabic, M., González-Loureiro, M., Furrer, O. (2014). Research on the strategy of multinational enterprises: Key approaches and new avenues. BRQ Business Research Quarterly, 17, 129–148.
  • *Dal Pont, G., Furlan, A., Vinelli, A. (2008). Interrelationships among lean bundles and their effects on operational performance. Operations Management Research, 1, 150–158.
  • Dalton, D. R., Dalton, C. M. (2008). Meta-analyses—Some very good steps toward a bit longer journey. Organizational Research Methods, 11, 127–147.
  • *Das, A., Jayaram, J. (2003). Relative importance of contingency variables for advanced manufacturing technology. International Journal of Production Research, 41, 4429–4452.
  • *Dean, J. W., Jr., Snell, S. A. (1991). Integrated manufacturing and job design: Moderating effects of organizational inertia. Academy of Management Journal, 34, 776–804.
  • Del Re, A. C., Hoyt, W. T. (2010). MAc: Meta-analysis with correlations. R Package Version 1.0.5 [Computer software]. http://CRAN.R-project.org/package=MAc
  • DerSimonian, R., Laird, N. (1986). Meta-analysis in clinical trials. Controlled Clinical Trials, 7, 177–188.
  • *Flynn, B. B., Sakakibara, S., Schroeder, R. G. (1995). Relationship between JIT and TQM: Practices and performance. Academy of Management Journal, 38, 1325–1360.
  • *Forza, C. (1996). Achieving superior operating performance from integrated pipeline management: An empirical study. International Journal of Physical Distribution and Logistics Management, 26, 36–63.
  • Forza, C., Di Nuzzo, F. (1998). Meta-analysis applied to operations management: Summarizing the results of empirical research. International Journal of Production Research, 36, 837–861.
  • *Fullerton, R. R., McWatters, C. S. (2001). The production performance benefits from JIT implementation. Journal of Operations Management, 19, 81–96.
  • Geyskens, I., Krishnan, R., Steenkamp, J. B. E., Cunha, P. V. (2009). A review and evaluation of meta-analysis practices in management research. Journal of Management, 35, 393–419.
  • Grand, J. A., Rogelberg, S. G., Allen, T. D., Landis, R. S., Reynolds, D. H., Scott, J. C., Tonidandel, S., Truxillo, D. M. (2018). A systems-based approach to fostering robust science in industrial-organizational psychology. Industrial and Organizational Psychology, 11, 4–42.
  • *He, J., Hayya, C. (2002). The impact of just–in–time production on food quality. Total Quality Management, 13, 651–670.
  • Hedges, L. V. (1989). An unbiased correction for sampling error in validity generalization studies. Journal of Applied Psychology, 74, 469–477.
  • Hedges, L. V., Olkin, I. (1985). Statistical methods for meta-analysis. Academic Press.
  • Hedges, L. V., Vevea, J. L. (1998). Fixed- and random-effects models in meta-analysis. Psychological Methods, 3, 486–504.
  • Higgins, J. P., Green, S. (Eds.) (2011). Cochrane handbook for systematic reviews of interventions (Vol. 4). John Wiley & Sons.
  • Higgins, J. P., Thompson, S. G. (2002). Quantifying heterogeneity in a meta-analysis. Statistics in Medicine, 21, 1539–1558.
  • Higgins, J. P., Thompson, S. G., Spiegelhalter, D. J. (2009). A reevaluation of random-effects meta-analysis. Journal of the Royal Statistical Society: Series A (Statistics in Society), 172, 137–159.
  • Higgins, J. P., Whitehead, A., Simmonds, M. (2011). Sequential methods for random-effects meta-analysis. Statistics in Medicine, 30, 903–921.
  • Horstmeier, C. A., Boer, D., Homan, A. C., Voelpel, S. C. (2017). The differential effects of transformational leadership on multiple identifications at work: A meta-analytic model. British Journal of Management, 28, 280–298.
  • Hunter, J. E., Schmidt, F. L. (1990). Methods of meta-analysis: Correcting error and bias in research findings. SAGE.
  • Hunter, J. E., Schmidt, F. L. (1994). Correcting for sources of artificial variation across studies. In H. Cooper, L. V. Hedges (Eds.), The handbook of research synthesis and meta-analysis (pp. 323–336). Russell Sage Foundation.
  • Hunter, J. E., Schmidt, F. L. (2000). Fixed effects vs. random effects meta-analysis models: Implications for cumulative research knowledge. International Journal of Selection and Assessment, 8, 275–292.
  • Hunter, J. E., Schmidt, F. L. (2004). Methods of meta-analysis: Correcting error and bias in research findings. SAGE.
  • *Jayaram, J., Vickery, S. K. (1998). Supply-based strategies, human resource initiatives, procurement leadtime, and firm performance. Journal of Supply Chain Management, 34, 12–24.
  • Kepes, S., Banks, G. C., McDaniel, M., Whetzel, D. L. (2012). Publication bias in the organizational sciences. Organizational Research Methods, 15, 624–662.
  • Kepes, S., McDaniel, M. A., Brannick, M. T., Banks, G. C. (2013). Meta-analytic reviews in the organizational sciences: Two meta-analytic schools on the way to MARS (the Meta-analytic Reporting Standards). Journal of Business and Psychology, 28, 123–143.
  • *Ketokivi, M. A., Schroeder, R. G. (2004). Manufacturing practices, strategic fit and performance: A routine-based view. International Journal of Operations and Production Management, 24, 171–191.
  • Kisamore, J. L., Brannick, M. T. (2008). An illustration of the consequences of meta-analysis model choice. Organizational Research Methods, 11, 35–53.
  • Kolev, K. D. (2016). To divest or not to divest: A meta-analysis of the antecedents of corporate divestitures. British Journal of Management, 27, 179–196.
  • *Lawrence, M., Hottenstein, P. (1995). The relationship between JIT manufacturing and performance in Mexican plants affiliated with US companies. Journal of Operations Management, 13, 3–18.
  • Le, H., Schmidt, F. L., Putka, D. J. (2009). The multifaceted nature of measurement artifacts and its implications for estimating construct–level relationships. Organizational Research Methods, 12, 165–200.
  • *Li, S., Rao, S., Ragu-Nathan, T. S., Ragu–Nathan, B. (2005). Development and validation of a measurement instrument for studying supply chain management practices. Journal of Operations Management, 23, 618–641.
  • Lipsey, M. W., Wilson, D. B. (2001). Practical meta-analysis. SAGE.
  • Mackelprang, A. W., Nair, A. (2010). Relationship between just-in-time manufacturing practices and performance: A meta-analytic investigation. Journal of Operations Management, 28, 283–302.
  • Marin-Garcia, J. A. (2015). Publishing in two phases for focused research by means of “research collaborations.” WPOM-Working Papers on Operations Management, 6, 76–80.
  • Martín-de Castro, G., Díez-Vial, I., Delgado-Verde, M. (2019). Intellectual capital and the firm: Evolution and research trends. Journal of Intellectual Capital, 20, 555–580.
  • Martínez-Noya, A., Narula, R. (2018). What more can we learn from R&D alliances? A review and research agenda. BRQ Business Research Quarterly, 21, 195–212.
  • *Matsui, Y. (2007). An empirical analysis of just–in–time production in Japanese manufacturing companies. International Journal of Production Economics, 108, 153–164.
  • *McKone, K. E., Schroeder, R. G., Cua, K. O. (2001). The impact of total productive maintenance practices on manufacturing performance. Journal of Operations Management, 19, 29–58.
  • *Mehra, S., Inman, R. A. (1992). Determining the critical elements of just-in-time implementation. Decision Sciences, 23, 160–174.
  • Miller, N., Pollock, V. E. (1994). Meta-analytic synthesis for theory development. In H. Cooper, L. V. Hedges (Eds.), The handbook of research synthesis and meta-analysis (pp. 457–486). Russell Sage Foundation.
  • Moher, D., Shamseer, L., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., Shekelle, P., Stewart, L. A. (2015). Preferred Reporting Items for Systematic Review and Meta-Analysis Protocols (PRISMA-P) 2015 statement. Systematic Reviews, 4, 1–9.
  • *Nahm, A. Y., Vonderembse, M. A., Koufteros, X. A. (2004). The impact of organizational culture on time-based manufacturing and performance. Decision Sciences, 35, 579–607.
  • *Narasimhan, R., Swink, M., Kim, S. W. (2006). Disentangling leanness and agility: An empirical investigation. Journal of Operations Management, 24, 440–457.
  • Novianti, P. W., Roes, K. C., van der Tweel, I. (2014). Estimation of between-trial variance in sequential meta-analyses: A simulation study. Contemporary Clinical Trials, 37, 129–138.
  • Olkin, I., Pratt, J. W. (1958). Unbiased estimation of certain correlation coefficients. Annals of Mathematical Statistics, 29, 201–211.
  • Overton, R. C. (1998). A comparison of fixed–effects and mixed (random–effects) models for meta-analysis tests of moderator variable effects. Psychological Methods, 3, 354–379.
  • Paule, R. C., Mandel, J. (1982). Consensus values and weighting factors. Journal of Research of the National Bureau of Standards, 87, 377–385.
  • Peters, J. L., Sutton, A. J., Jones, D. R., Abrams, K. R., Rushton, L. (2008). Contour-enhanced meta-analysis funnel plots help distinguish publication bias from other causes of asymmetry. Journal of Clinical Epidemiology, 61, 991–996.
  • Raudenbush, S. W. (1994). Random effects models. In H. Cooper, L. V. Hedges (Eds.), The handbook of research synthesis (pp. 301–321). Russell Sage Foundation.
  • R Core Team. (2017). R: A language and environment for statistical computing. R Foundation for Statistical Computing [Computer software].
  • Rosenzweig, E. D., Easton, G. S. (2010). Tradeoffs in manufacturing? A meta-analysis and critique of the literature. Production and Operations Management, 19, 127–141.
  • *Sakakibara, S., Flynn, B. B., Schroeder, R. G. (1993). A framework and measurement instrument for just-in-time manufacturing. Production and Operations Management, 2, 177–194.
  • Sangnawakij, P., Böhning, D., Adams, S., Stanton, M., Holling, H. (2017). Statistical methodology for estimating the mean difference in a meta-analysis without study-specific variance information. Statistics in Medicine, 36, 1395–1413.
  • Schild, A. H., Voracek, M. (2013). Less is less: A systematic review of graph use in meta-analyses. Research Synthesis Methods, 4, 209–219.
  • Schmidt, F. L., Hunter, J. E. (2015). Methods of meta-analysis: Correcting error and bias in research findings (3rd ed.). SAGE.
  • Schulze, R. (2004). Meta-analysis: A comparison of approaches. Hogrefe & Huber.
  • *Shah, R., Ward, P. T. (2003). Lean manufacturing: Context, practice bundles, and performance. Journal of Operations Management, 21, 129–149.
  • Sidik, K., Jonkman, J. N. (2007). A comparison of heterogeneity variance estimators in combining results of studies. Statistics in Medicine, 26, 1964–1981.
  • *Sim, K. L., Curatola, A. P. (1999). Time-based competition. International Journal of Quality and Reliability Management, 16, 659–674.
  • Stroup, D. F., Berlin, J. A., Morton, S. C., Olkin, I., Williamson, G. D., Rennie, D., Moher, D., Becker, B. J., Sipe, T. A., Thacker, S. B. (2000). Meta-analysis of observational studies in epidemiology: A proposal for reporting. Journal of the American Medical Association, 283(15), 2008–2012.
  • Suurmond, R., van Rhee, H., Hak, T. (2017). Introduction, comparison, and validation of meta-essentials: A free and simple tool for meta-analysis. Research Synthesis Methods, 8, 537–553.
  • *Swink, M., Narasimhan, R., Kim, S. W. (2005). Manufacturing practices and strategy integration: Effects on cost efficiency, flexibility, and market–based performance. Decision Sciences, 36, 427–475.
  • Thomé, A. M. T., Scavarda, L. F., Scavarda, A. J. (2016). Conducting systematic literature review in operations management. Production Planning & Control, 27, 408–420.
  • Vandenbroucke, J. P., Von Elm, E., Altman, D. G., Gøtzsche, P. C., Mulrow, C. D., Pocock, S. J., Poole, C., Schlesselman, J. J., Egger, M. (2007). Strengthening the Reporting of Observational Studies in Epidemiology (STROBE): Explanation and elaboration. PLOS Medicine, 4, Article e297.
  • Veroniki, A. A., Jackson, D., Viechtbauer, W., Bender, R., Bowden, J., Knapp, G., Kuss, O., Higgins, J. P. T., Langan, D., Salanti, G. (2016). Methods to estimate the between-study variance and its uncertainty in meta-analysis. Research Synthesis Methods, 7, 55–79.
  • Viechtbauer, W. (2005). Bias and efficiency of meta-analytic variance estimators in the random-effects model. Journal of Educational and Behavioral Statistics, 30, 261–293.
  • Viechtbauer, W. (2010). Conducting meta-analyses in R with the metafor package. Journal of Statistical Software, 36, 1–48.
  • Viechtbauer, W., Cheung, M. W. L. (2010). Outlier and influence diagnostics for meta-analysis. Research Synthesis Methods, 1, 112–125.
  • *Ward, P., Zhou, H. (2006). Impact of information technology integration and lean/ just–in–time practices on lead–time performance. Decision Sciences, 37, 177–203.
  • Whitener, E. M. (1990). Confusion of confidence intervals and credibility intervals in meta-analysis. Journal of Applied Psychology, 75, 315–321.
  • Zupic, I., Čater, T. (2015). Bibliometric methods in management and organization. Organizational Research Methods, 18, 429–472.