Abstract:
A recently developed language representation model named Bidirectional Encoder Representations from Transformers (BERT) is based on a pre-trained deep learning approach that has achieved excellent results on many complex Natural Language Processing (NLP) tasks, such as classification and prediction. This survey summarizes BERT, its multiple variants, and its latest developments and applications in various computer science and engineering fields. Furthermore, it presents BERT's open problems and attractive future research directions across different areas and datasets. Overall, the findings show that BERT and its recent variants achieve more accurate, faster, and better-optimized results on most complex problems than typical Machine Learning and Deep Learning methods.
Published in: 2023 20th Learning and Technology Conference (L&T)
Date of Conference: 26 January 2023
Date Added to IEEE Xplore: 11 April 2023