Abstract
Purpose
The aim of this study is to offer valuable insights to businesses and to facilitate a better understanding of transformer-based models (TBMs), which are among the most widely employed generative artificial intelligence (GAI) models and have garnered substantial attention for their ability to process and generate complex data.
Design/methodology/approach
Existing studies on TBMs tend to be limited in scope, either focusing on specific fields or being highly technical. To bridge this gap, this study conducts a robust bibliometric analysis to explore trends across journals, authors, affiliations, countries and research trajectories using science mapping techniques – co-citation, co-word and strategic diagram analysis.
Findings
Identified research gaps encompass the evolution of new closed- and open-source TBMs; limited exploration across industries such as education and disciplines such as marketing; a lack of in-depth investigation of TBMs’ adoption in the health sector; a scarcity of research on TBMs’ ethical considerations; and potential research on TBMs’ performance in diverse applications, such as image processing.
Originality/value
The study offers an updated landscape of TBMs and proposes a theoretical framework for their adoption in organizations. Implications for managers and researchers, along with suggested research questions to guide future investigations, are provided.