A Systematic Literature Review in Cross-browser Testing

Authors

  • Leandro N. Sabaren, Departamento de Informática, Facultad de Ciencias Exactas y Naturales y Agrimensura, Universidad Nacional del Nordeste, Corrientes, Argentina
  • Maximiliano A. Mascheroni, Departamento de Informática, Facultad de Ciencias Exactas y Naturales y Agrimensura, Universidad Nacional del Nordeste, Corrientes, Argentina
  • Cristina L. Greiner, Departamento de Informática, Facultad de Ciencias Exactas y Naturales y Agrimensura, Universidad Nacional del Nordeste, Corrientes, Argentina
  • Emanuel Irrazábal, Departamento de Informática, Facultad de Ciencias Exactas y Naturales y Agrimensura, Universidad Nacional del Nordeste, Corrientes, Argentina

DOI:

https://doi.org/10.24215/16666038.18.e03

Keywords:

Cross-browser testing, Systematic literature review, Web application

Abstract

Many users access web pages from different browsers and expect the same user experience in all of them. However, several causes produce compatibility issues, and the resulting defects affect both functionality and user-interface components. In this paper we present a systematic literature review that aims to find and summarize existing techniques, tools and challenges related to cross-browser testing. According to the results, the most widely used technique is visual analysis; nevertheless, challenges remain, the most important being the identification of dynamic components in the user interface. Cross-browser compatibility topics are gaining importance, as reflected by the growing number of published articles. Even so, some techniques are not yet fully developed and do not fully support test automation practices.

References

[1] E. Dustin, J. Rashka and D. McDiarmid, Quality Web Systems: Performance, Security, and Usability, Boston, Massachusetts, USA: Addison Wesley, 2001.
[2] V. S. Moustakis, C. Litos, A. Dalivigas and L. Tsironis, “Website Quality Assessment Criteria,” in IQ, pp. 59-73, 2004.
[3] J. F. Kurose and K. W. Ross, Redes de computadoras. Un enfoque descendente, Pearson, 2010.
[4] “Introduction to cross browser testing – Learn web development | MDN”. Available: https://developer.mozilla.org/en-US/docs/Learn/Tools_and_testing/Cross_browser_testing/Introduction. Accessed on 2017-04-03.
[5] L. N. Sabaren, M. A. Mascheroni, C. L. Greiner and E. Irrazábal, “Una Revisión Sistemática de la Literatura en Pruebas de Compatibilidad Web,” in XXIII Congreso Argentino de Ciencias de la Computación (CACIC 2017), pp. 812-821, La Plata, Argentina, 2017.
[6] B. Kitchenham and S. Charters, “Guidelines for performing Systematic Literature Reviews in Software Engineering,” Keele University and Durham University Joint Report, EBSE 2007-001, 2007.
[7] Z. Zakaria, R. Atan, A. A. A. Ghani and N. F. M. Sani, “Unit Testing Approaches for BPEL: A Systematic Review,” in Software Engineering Conference, APSEC '09. Asia-Pacific, pp. 316-322, Penang, Malaysia, 2009.
[8] E. Mendes, “A systematic review of Web engineering research,” in International Symposium on Empirical Software Engineering, pp. 498-507, Queensland, Australia, 2005.
[9] S. Dogan, A. Betin-Can and V. Garousi, “Web application testing: A systematic literature review,” Journal of Systems and Software, vol. 91, pp. 174-201, 2014.
[10] E. I. Nabil, “Specifications for Web Services Testing: A Systematic Review,” in 2015 IEEE World Congress on Services, pp. 152-159, New York, USA, 2015.
[11] S. Choudhary, H. Versee and A. Orso, “A cross-browser web application testing tool,” in 2010 IEEE International Conference on Software Maintenance, pp. 1-6, Romania, 2010.
[12] S. R. Choudhary, H. Versee and A. Orso, “WEBDIFF: Automated Identification of Cross-browser Issues in Web Applications,” in 26th IEEE International Conference on Software Maintenance, pp. 1-10, Romania, 2010.
[13] S. Choudhary, “Detecting Cross-browser Issues in Web Applications,” in 2011 33rd International Conference on Software Engineering, pp. 1146-1148, Hawaii, USA, 2011.
[14] A. Mesbah and M. R. Prasad, “Automated Cross-Browser Compatibility Testing,” in 2011 33rd International Conference on Software Engineering, pp. 561-570, Hawaii, USA, 2011.
[15] J. G. Ochin, “Cross Browser Incompatibility: Reasons and Solutions,” International Journal of Software Engineering & Applications, vol. 2, no. 3, pp. 66-77, July 2011.
[16] S. Choudhary, M. Prasad and A. Orso, “CROSSCHECK: Combining Crawling and Differencing To Better Detect Cross-browser Incompatibilities in Web Applications,” in 2012 IEEE Fifth International Conference on Software Testing, Verification and Validation, pp. 171-180, Montreal, Quebec, Canada, 2012.
[17] V. Dallmeier, M. Burger, T. Orth and A. Zeller, “WebMate: A Tool for Testing Web 2.0 Applications,” in Workshop on JavaScript Tools, pp. 11-15, Beijing, China, 2012.
[18] A. Issa, J. Sillito and V. Garousi, “Visual Testing of Graphical User Interfaces: an Exploratory Study Towards Systematic Definitions and Approaches,” in 2012 14th IEEE International Symposium on Web Systems Evolution, pp. 11-15, Trento, Italy, 2012.
[19] A. Sivaji, N. A. Ramli, Z. M. Nor, N.-K. Chuan, F. Wan, A. Wan and S. Shi-Tzuaan, “Measuring and Improving Website User Experience using UX Methodologies: A Case Study on Cross Browser Compatibility Heuristic,” in Southeast Asian Network of Ergonomics Societies, pp. 1-6, Langkawi, Kedah, Malaysia, 2012.
[20] S. Choudhary, M. Prasad and A. Orso, “X-PERT: Accurate identification of cross-browser issues in web applications,” in 2013 35th International Conference on Software Engineering, pp. 702-711, San Francisco, California, USA, 2013.
[21] V. Dallmeier, M. Burger, T. Orth and A. Zeller, “WebMate: Generating Test Cases for Web 2.0,” in International Conference on Software Quality, Software Quality. Increasing Value in Software and Systems Development, pp. 55-69, Vienna, Austria, 2013.
[22] N. Semenenko, M. Dumas and T. Saar, “Browserbite: Accurate Cross-Browser Testing via Machine Learning over Image Features,” in 2013 29th IEEE International Conference on Software Maintenance, pp. 528-531, Eindhoven, Netherlands, 2013.
[23] S. Choudhary, M. Prasad and A. Orso, “X-PERT: a web application testing tool for cross-browser inconsistency detection,” in 2014 International Symposium on Software Testing and Analysis, pp. 417-420, San Jose, California, USA, 2014.
[24] A. van Deursen, A. Mesbah and A. Nederlof, “Crawl-based analysis of web applications: Prospects and challenges,” Science of Computer Programming, vol. 87, pp. 173-180, 2014.
[25] B. Kaalra and K. Gowthaman, “Cross Browser Testing Using Automated Test Tools,” International Journal of Advanced Studies in Computer Science and Engineering, vol. 3, no. 10, pp. 7-12, 2014.
[26] X. Li and H. Zeng, “Modeling web application for cross-browser compatibility testing,” in 2014 15th IEEE/ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing, pp. 1-5, Nevada, USA, 2014.
[27] S. Mahajan and W. Halfond, “Finding HTML presentation failures using image comparison techniques,” in 29th ACM/IEEE International Conference on Automated Software Engineering, pp. 91-96, Vasteras, Sweden, 2014.
[28] E. Selay, Z. Q. Zhou and J. Zou, “Adaptive Random Testing for Image Comparison in Regression Web Testing,” in 2014 International Conference on Digital Image Computing: Techniques and Applications, pp. 1-7, New South Wales, Australia, 2014.
[29] M. He, H. Tang, G. Wu and H. Zhong, “A Crowdsourcing framework for Detecting Cross-Browser Issues in Web Application,” in 7th Asia-Pacific Symposium on Internetware, pp. 239-242, Wuhan, China, 2015.
[30] A. Hori, S. Takada, H. Tanno and M. Oinuma, “An oracle based on image comparison for regression testing of web applications,” in 27th International Conference on Software Engineering and Knowledge Engineering, pp. 639-645, Pittsburgh, USA, 2015.
[31] S. Mahajan and W. G. J. Halfond, “Detection and Localization of HTML Presentation Failures Using Computer Vision-Based Techniques,” in 2015 IEEE 8th International Conference on Software Testing, Verification and Validation, pp. 1-10, Graz, Austria, 2015.
[32] T. Saar, M. Dumas, M. Kaljuve and N. Semenenko, “Browserbite: cross-browser testing via image processing,” Software: Practice and Experience, vol. 46, no. 11, pp. 1459-1477, 2015.
[33] H. Shi and H. Zeng, “Cross-Browser Compatibility Testing Based on Model Comparison,” in 2015 International Conference on Computer Application Technologies, pp. 103-107, Matsue, Japan, 2015.
[34] S. Xu and H. Zeng, “Static Analysis Technique of Cross-Browser Compatibility Detecting,” in 2015 3rd International Conference on Applied Computing and Information Technology/2nd International Conference on Computational Science and Intelligence, pp. 103-107, Okayama, Japan, 2015.
[35] N. Barskar and C. Patidar, “A Survey on Cross Browser Inconsistencies in Web Application,” International Journal of Computer Applications, vol. 137, no. 4, pp. 37-41, 2016.
[36] M. He, G. Wu, H. Tang, W. Chen, J. Wei, H. Zhong and T. Huang, “X-Check: A Novel Cross-browser Testing Service based on Record/Replay,” in 2016 IEEE International Conference on Web Services, pp. 123-130, San Francisco, California, USA, 2016.
[37] S. Mahajan, B. Li, P. Behnamghader and W. G. J. Halfond, “Using Visual Symptoms for Debugging Presentation Failures in Web Applications,” in 2016 IEEE International Conference on Software Testing, Verification and Validation, pp. 191-201, USA, 2016.
[38] M. Sharma and C. P. Patidar, “An Automated Approach for Cross-Browser Inconsistency (XBI) Detection,” in 9th Annual ACM India Conference, pp. 141-145, India, 2016.
[39] G. Wu, M. He, H. Tang and J. Wei, “Detect Cross-Browser Issues for JavaScript-Based Web Applications Based on Record/Replay,” in 2016 IEEE International Conference on Software Maintenance and Evolution, pp. 78-87, Raleigh, North Carolina, USA, 2016.
[40] C. Patidar, M. Sharma and V. Sharda, “Detection of Cross Browser Inconsistency by Comparing Extracted Attributes,” International Journal of Scientific Research in Computer Science and Engineering, vol. 5, no. 1, pp. 1-6, 2017.
[41] M. F. Kıraç, B. Aktemur and H. Sözer, “VISOR: A Fast Image Processing Pipeline with Scaling and Translation Invariance for Test Oracle Automation of Visual Output Systems,” Journal of Systems and Software, vol. 136, pp. 266-277, 2018.
[42] “W3C Document Object Model,” Available at: https://www.w3.org/DOM/. Accessed on 2017-05-22.
[43] “What Is Web 2.0 - O’Reilly Media,” Available at: http://www.oreilly.com/pub/a/web2/archive/what-is-web-20.html. Accessed on 2017-05-03.
[44] D. M. W. Powers, “Evaluation: From precision, recall and F-measure to ROC, informedness, markedness & correlation,” Journal of Machine Learning Technologies, vol. 2, no. 1, pp. 37-63, 2011.
[45] V. Garousi, A. Mesbah, A. Betin-Can and S. Mirshokraie, “A systematic mapping study of web application testing,” Information and Software Technology, vol. 55, no. 8, pp. 1374-1396, 2013.
[46] F. Paz and J. A. Pow-Sang, “Current Trends in Usability Evaluation Methods: A Systematic Review,” in 2014 7th International Conference on Advanced Software Engineering and Its Applications, pp. 11-15, Haikou, China, 2014.
[47] M. Al-Ismail and A. Sajeev, “Usability challenges in mobile web,” in 2014 IEEE International Conference on Communication, Networks and Satellite, pp. 50-55, Jakarta, Indonesia, 2014.
[48] M. Mascheroni, M. Cogliolo and E. Irrazábal, “Automatización de pruebas de compatibilidad web en un entorno de desarrollo continuo de software,” in Simposio Argentino de Ingeniería de Software - JAIIO 45, pp. 51-63, 2016.
[49] M. Mascheroni, M. Cogliolo and E. Irrazábal, “Automatic detection of Web Incompatibilities using Digital Image Processing,” Electronic Journal of Informatics and Operations Research, vol. 16, no. 1, pp. 29-45, 2017.
[50] G. Saleem, F. Azam, M. Younus, N. Ahmed and L. Yong, “Quality assurance of web services: A systematic literature review,” in 2016 2nd IEEE International Conference on Computer and Communications, pp. 1391-1396, Chengdu, China, 2016.
[51] V. Garousi, M. Felderer and T. Hacaloğlu, “What We Know about Software Test Maturity and Test Process Improvement,” IEEE Software, vol. 35, no. 1, pp. 84-92, 2017.
[52] X. Zhou, Y. Jin, H. Zhang, S. Li and X. Huang, “A Map of Threats to Validity of Systematic Literature Reviews in Software Engineering,” in 2016 23rd Asia-Pacific Software Engineering Conference, pp. 153-160, Hamilton, New Zealand, 2017.
[53] M. Jazayeri, “Some Trends in Web Application Development,” in Future of Software Engineering 2007, FOSE '07, pp. 199-213, Minneapolis, Minnesota, USA, 2007.
[54] E. Kiciman and B. Livshits, “AjaxScope: a platform for remotely monitoring the client-side behavior of web 2.0 applications,” in Proceedings of Twenty-first ACM SIGOPS Symposium on Operating Systems Principles, pp. 17-30, Stevenson, Washington, USA, 2007.
[55] F. Ricca and P. Tonella, “Web testing: a roadmap for the empirical research,” in Seventh IEEE International Symposium on Web Site Evolution, pp. 63-70, Budapest, Hungary, 2005.
[56] R. Ramler, E. Weippl, M. Winterer, W. Schwinger and J. Altmann, “A Quality-Driven Approach to Web Testing,” in Ibero American Conference on Web Engineering, pp. 81-95, Santa Fé, Argentina, 2002.

Published

2018-04-25

How to Cite

Sabaren, L. N., Mascheroni, M. A., Greiner, C. L., & Irrazábal, E. (2018). A Systematic Literature Review in Cross-browser Testing. Journal of Computer Science and Technology, 18(01), e03. https://doi.org/10.24215/16666038.18.e03

Issue

Section

Original Articles