Poster

OWL: A Large Language Model for IT Operations

Hongcheng Guo · Jian Yang · Jiaheng Liu · Liqun Yang · Linzheng Chai · Jiaqi Bai · Junran Peng · Xiaorong Hu · Chao Chen · Dongfeng Zhang · Xu Shi · Tieqiao Zheng · Liangfan Zheng · Bo Zhang · Ke Xu · Zhoujun Li

Halle B #74

Abstract:

With the rapid advancement of IT operations, efficiently managing and analyzing large volumes of data for practical applications has become increasingly critical. Natural Language Processing (NLP) techniques have demonstrated remarkable capabilities in tasks such as named entity recognition, machine translation, and dialogue systems, and Large Language Models (LLMs) have recently achieved significant improvements in many domain-specific areas. However, there is a noticeable gap in the development of specialized LLMs tailored for IT operations. In this paper, we introduce OWL, a large language model trained on Owl-Instruct, a dataset we constructed that covers a wide range of IT-related information. To address the constraint of the maximum input length, we propose the Homogeneous Markov Context Extension (HMCE) method. A mixture-of-adapters strategy is leveraged to improve parameter-efficient tuning across different domains and tasks. Further, we evaluate OWL on Owl-Bench, a benchmark we established, as well as on open IT-related benchmarks. OWL demonstrates superior performance on IT tasks, outperforming existing models by significant margins. We hope the findings of our work provide insights toward revolutionizing IT operations with specialized LLMs.
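The abstract names a mixture-of-adapters strategy for parameter-efficient tuning. A common way to realize this idea is a frozen base linear layer augmented with several low-rank (LoRA-style) adapters whose contributions are mixed by a softmax router. The sketch below is an illustrative minimal version of that general pattern, not the OWL paper's actual implementation; all names and dimensions are hypothetical.

```python
import numpy as np

# Illustrative sketch (not OWL's code): a frozen base weight W plus
# n_adapters low-rank adapters B_i @ A_i, mixed per-input by a router.
rng = np.random.default_rng(0)
d_in, d_out, rank, n_adapters = 8, 8, 2, 3

W = rng.normal(size=(d_out, d_in))             # frozen base weight
A = rng.normal(size=(n_adapters, rank, d_in))  # adapter down-projections
B = np.zeros((n_adapters, d_out, rank))        # adapter up-projections (zero-init,
                                               # so adapters start as a no-op)
R = rng.normal(size=(n_adapters, d_in))        # router weights

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def mixture_of_adapters(x):
    gates = softmax(R @ x)                     # one mixing weight per adapter
    base = W @ x                               # frozen base path
    delta = sum(gates[i] * (B[i] @ (A[i] @ x))  # gated low-rank updates
                for i in range(n_adapters))
    return base + delta

x = rng.normal(size=d_in)
y = mixture_of_adapters(x)
```

During fine-tuning, only `A`, `B`, and `R` would be updated; because each adapter has rank much smaller than `d_in`, the number of trainable parameters stays small while the router lets different adapters specialize to different domains or tasks.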