Existing models for aerospace text classification suffer from several problems: they fail to extract the weights of key parts of the text, they classify inaccurately, and they adapt poorly to demanding working environments. To address these issues, we fuse the BERT pre-trained model with the LSTM neural network, combining multi-feature embedding and multi-network fusion to construct a BERT-LSTM model. The BERT model first converts the input text into word vectors. The word vectors of the text sequence are then concatenated into a matrix, and convolution kernels of different sizes perform convolution operations; the resulting maximum features are combined into a feature vector set, which is fed into a Bi-LSTM layer for sequence modeling. Self-attention is used to capture key information from the global context, further increasing the weight of key features in text classification. Comparison experiments with TextCNN, TextRNN, DPCNN, and other models on the aerospace text classification task show that the proposed model, based on a bidirectional long short-term memory network fused with an attention mechanism, improves accuracy by 25.3%, 25.8%, and 18.4% over these models, respectively.
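The data flow described above (word-vector matrix → multi-size convolution with max pooling → sequence hidden states → self-attention re-weighting) can be sketched in NumPy. This is a minimal illustration of the tensor shapes only, not the paper's implementation: the random matrix `X` stands in for BERT word vectors, `H` stands in for Bi-LSTM hidden states, and the kernel sizes (2, 3, 4), sequence length, and dimensions are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

seq_len, emb_dim = 32, 768                    # 768 = BERT-base hidden size (assumed)
X = rng.standard_normal((seq_len, emb_dim))   # stand-in for BERT word vectors

def conv_max_feature(X, k):
    """Convolve one kernel of width k over the token axis, then take the
    maximum over all positions (max-over-time pooling -> one scalar)."""
    W = rng.standard_normal((k, X.shape[1])) / np.sqrt(k * X.shape[1])
    feats = [np.sum(X[i:i + k] * W) for i in range(X.shape[0] - k + 1)]
    return max(feats)

# Kernels of different sizes capture n-gram features of different widths;
# their max-pooled outputs are combined into one feature vector.
feature_vec = np.array([conv_max_feature(X, k) for k in (2, 3, 4)])

def self_attention(H):
    """Scaled dot-product self-attention over a sequence of hidden states:
    positions attend to each other, raising the weight of key positions."""
    d = H.shape[1]
    scores = H @ H.T / np.sqrt(d)
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    weights = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    return weights @ H

# Stand-in for Bi-LSTM hidden states; attention re-weights the sequence.
H = rng.standard_normal((seq_len, 64))
context = self_attention(H)
print(feature_vec.shape, context.shape)   # (3,) (32, 64)
```

In the full model these pieces are trained end to end and the attended representation feeds a classification layer; the sketch only shows how the shapes compose.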