Title: Enhancing Machining Feature Recognition in CAD Models through Transfer Learning on BRepNet
Author: Gkrispanis, Konstantinos
Dates: 2024-10-22; 2024-09-18
Published in: 35. Forum Bauinformatik, fbi 2024: 42-49
Handle: https://hdl.handle.net/11420/49647
Type: Conference Paper
DOI: 10.15480/882.13539
Language: en
License: https://creativecommons.org/licenses/by/4.0/
Keywords: B-Rep; CAD learning; CAD segmentation; Geometric Deep Learning; Machining features
Classification: Computer Science, Information and General Works::006: Special computer methods::006.3: Artificial Intelligence; Technology::620: Engineering; Computer Science, Information and General Works::004: Computer Sciences; Computer Science, Information and General Works::005: Computer Programming, Programs, Data and Security

Abstract: Industrial products are initially designed using Computer-Aided Design (CAD) systems, which employ Boundary Representation (B-Rep) models for precision and efficiency. During the manufacturing process, it is important to identify machining features in CAD models, as this enhances the ability to analyze, process, and manufacture them. This study proposes a deep learning approach that employs transfer learning to enhance the identification and segmentation of these features. BRepNet, a neural network designed to operate directly on B-Reps, is adapted by integrating pre-trained weights and adding a modified segmentation head to refine feature identification. By freezing the backbone network and training only the segmentation head, the approach minimizes computational demands and accelerates training, making deep learning more feasible for real-world applications that require rapid deployment and high accuracy. Tested on the MFCAD dataset, the proposed architecture achieves near state-of-the-art results, with an accuracy of 0.998 in feature identification for the version with the frozen backbone and 0.999 for the fully fine-tuned version. The study highlights the effectiveness of transfer learning in achieving high accuracy with fewer resources, demonstrating the potential for more efficient and robust CAD model analysis.
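
The transfer-learning setup described in the abstract (a pre-trained backbone frozen in place, with only a newly attached segmentation head being trained) can be illustrated with a minimal PyTorch-style sketch. This is not the authors' implementation and does not use BRepNet's actual API; `PretrainedBackbone`, `SegmentationHead`, and the weight-file path are hypothetical stand-ins chosen only to show the freezing and optimizer wiring.

```python
# Minimal sketch of backbone freezing with a trainable segmentation head.
# All class names, dimensions, and the weight path are illustrative assumptions.
import torch
import torch.nn as nn

class PretrainedBackbone(nn.Module):
    """Placeholder for a backbone that would be loaded with pre-trained weights."""
    def __init__(self, in_dim=16, hidden_dim=64):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.layers(x)

class SegmentationHead(nn.Module):
    """Modified head mapping backbone embeddings to per-face machining-feature classes."""
    def __init__(self, hidden_dim=64, num_classes=16):
        super().__init__()
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        return self.classifier(x)

backbone = PretrainedBackbone()
# backbone.load_state_dict(torch.load("pretrained_weights.pt"))  # hypothetical checkpoint

# Freeze the backbone so gradients are not computed for its parameters.
for param in backbone.parameters():
    param.requires_grad = False

head = SegmentationHead()
model = nn.Sequential(backbone, head)

# Only the segmentation head's parameters are handed to the optimizer.
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# Dummy batch: 32 faces with 16-dimensional input features and 16 target classes.
x = torch.randn(32, 16)
y = torch.randint(0, 16, (32,))

logits = model(x)
loss = criterion(logits, y)
loss.backward()
optimizer.step()
```

In the fully fine-tuned variant mentioned in the abstract, the freezing loop would simply be omitted and all parameters passed to the optimizer, at the cost of higher computational demand.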