IEEE International Conference on Big Data

CyBERT: Contextualized Embeddings for the Cybersecurity Domain


We present CyBERT, a domain-specific Bidirectional Encoder Representations from Transformers (BERT) model, fine-tuned on a large corpus of textual cybersecurity data. State-of-the-art natural language models that can process dense, fine-grained textual threat, attack, and vulnerability information can provide numerous benefits to the cybersecurity community. The primary contribution of this paper is providing the security community with an initial fine-tuned BERT model that can perform a variety of cybersecurity-specific downstream tasks with high accuracy and efficient use of resources. We create a cybersecurity corpus from open-source unstructured and semi-structured Cyber Threat Intelligence (CTI) data and use it to fine-tune a base BERT model with Masked Language Modeling (MLM) to recognize specialized cybersecurity entities. We evaluate the model using various downstream tasks that can benefit modern Security Operations Centers (SOCs). The fine-tuned CyBERT model outperforms the base BERT model in the domain-specific MLM evaluation. We also provide use-cases of CyBERT applications in cybersecurity-based downstream tasks.
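The Masked Language Modeling objective mentioned above works by hiding a fraction of input tokens and training the model to reconstruct them; this is how CyBERT adapts base BERT to cybersecurity text. The sketch below shows the standard BERT masking scheme (select ~15% of tokens; of those, 80% become [MASK], 10% a random token, 10% unchanged) in plain Python. The token and vocabulary IDs follow the commonly published `bert-base-uncased` values, but this is an illustrative stand-in, not the paper's actual training code.

```python
import random

MASK_ID = 103       # [MASK] token id in bert-base-uncased's vocabulary
VOCAB_SIZE = 30522  # bert-base-uncased vocabulary size
SPECIAL_IDS = {101, 102}  # [CLS] and [SEP]; special tokens are never masked

def mask_tokens(token_ids, mlm_prob=0.15, rng=None):
    """Apply BERT-style MLM masking.

    Selects ~mlm_prob of the tokens as prediction targets; of those,
    80% are replaced with [MASK], 10% with a random vocabulary token,
    and 10% are left unchanged. Returns (masked_inputs, labels), where
    labels hold the original token at selected positions and -100
    elsewhere (the value ignored by the cross-entropy loss).
    """
    rng = rng or random.Random(0)
    inputs = list(token_ids)
    labels = [-100] * len(inputs)
    for i, tok in enumerate(inputs):
        if tok in SPECIAL_IDS or rng.random() >= mlm_prob:
            continue
        labels[i] = tok  # the model must predict the original token here
        r = rng.random()
        if r < 0.8:
            inputs[i] = MASK_ID               # 80%: replace with [MASK]
        elif r < 0.9:
            inputs[i] = rng.randrange(VOCAB_SIZE)  # 10%: random token
        # else: 10% of selected positions keep the original token
    return inputs, labels
```

During fine-tuning, the (masked_inputs, labels) pairs feed the model's MLM head; only the selected positions contribute to the loss, so the model learns to predict domain-specific tokens (e.g., malware names, CVE identifiers) from their context.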

This material is based upon work supported by a grant from NSA and by the National Science Foundation under Grant No. 2114892.



Keywords: BERT, cybersecurity, LLM

Publication type: InProceedings

Publisher: IEEE

DOI: 10.1109/BigData52589.2021.9671824

