Abstract
This article presents a comparative evaluation of prominent large language models: Google's Bard, ChatGPT 3.5, and ChatGPT 4. It examines each model's architecture, training data, and natural language processing capabilities, and systematically assesses their performance on tasks such as text generation, text completion, and question answering, comparing accuracy, processing speed, and efficiency. The concluding section explores prospective application domains for these models, including customer service, education, and healthcare, and suggests directions for future research and technological improvement. Overall, the study offers insights into the current state of large language models and their potential to enhance human-computer interaction and advance natural language processing technology.
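The kind of comparison the abstract describes, scoring several chatbots on question-answering accuracy while timing their responses, can be sketched as a small harness. This is an illustrative sketch only, not the authors' evaluation code: the model callables, the substring-match scoring rule, and the example prompts are all hypothetical stand-ins; a real evaluation would call each chatbot's API and use a more robust correctness check.

```python
import time

def time_response(model_fn, prompt):
    """Call one model on one prompt and return (answer, latency in seconds)."""
    start = time.perf_counter()
    answer = model_fn(prompt)
    return answer, time.perf_counter() - start

def evaluate(models, qa_pairs):
    """Compare models on question-answering accuracy and mean response latency.

    models:   dict mapping a model name to a callable prompt -> answer string
    qa_pairs: list of (question, expected answer substring) tuples
    """
    results = {}
    for name, model_fn in models.items():
        correct, latencies = 0, []
        for question, expected in qa_pairs:
            answer, latency = time_response(model_fn, question)
            latencies.append(latency)
            # Hypothetical scoring rule: case-insensitive substring match.
            if expected.lower() in answer.lower():
                correct += 1
        results[name] = {
            "accuracy": correct / len(qa_pairs),
            "mean_latency_s": sum(latencies) / len(latencies),
        }
    return results

# Stand-in "models" for illustration; real use would wrap each chatbot's API.
models = {
    "model_a": lambda q: "Paris is the capital of France.",
    "model_b": lambda q: "I am not sure.",
}
qa_pairs = [("What is the capital of France?", "Paris")]
print(evaluate(models, qa_pairs))
```

The same loop extends to the other tasks the paper mentions (text generation and completion) by swapping in task-appropriate prompts and scoring functions.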
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Kumar, V. et al. (2024). Large-Language-Models (LLM)-Based AI Chatbots: Architecture, In-Depth Analysis and Their Performance Evaluation. In: Santosh, K., et al. Recent Trends in Image Processing and Pattern Recognition. RTIP2R 2023. Communications in Computer and Information Science, vol 2027. Springer, Cham. https://doi.org/10.1007/978-3-031-53085-2_20
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-53084-5
Online ISBN: 978-3-031-53085-2
eBook Packages: Computer Science, Computer Science (R0)