SoftBank announced that its generative AI foundation model for the telecommunications industry, the Large Telecom Model (LTM), achieved a top-tier overall ranking*1 in the GSMA Open-Telco LLM Benchmarks, which evaluate the performance of large language models (LLMs) specialized for the telecommunications sector.
As part of its ongoing efforts to advance LTM, SoftBank established a telecom-specialized learning framework built upon the data assets and operational expertise accumulated through its experience as a telecommunications operator. This top-tier recognition reflects the effectiveness of SoftBank’s learning framework in steadily improving telecom-domain performance from pre-trained models.
*1 As of March 30, 2026. The ranking is based on the average score across all evaluation datasets, among the 84 models submitted to the GSMA Open-Telco LLM Benchmarks.
Under the Open Telco AI initiative launched at MWC Barcelona 2026, the GSMA is building an open, collaborative foundation for telco-grade AI, bringing together operators, vendors, developers, and academia and providing a shared portal for models, datasets, compute, and tools. The GSMA Open-Telco LLM Benchmarks are intrinsic to this effort because they provide the transparent evaluation layer that allows the ecosystem to measure, compare, and improve model performance on telecom-specific, real-world tasks, helping move the industry beyond generic "frontier" capabilities toward the accuracy and reliability that telecom networks require.
Enhancing LTM through a telecom-specialized learning framework
In developing generative AI foundation models, simply applying a general-purpose LLM is insufficient to achieve performance at a level suitable for real-world deployment. In the telecommunications domain, models must not only understand complex technical specifications but also perform domain-specific question answering and interpret operational logs, which requires a deep understanding of sophisticated technical systems and telecom-specific contexts. Accordingly, a structured learning framework specialized for the telecommunications industry is essential.
To meet these requirements, SoftBank established a telecom-specialized learning framework (the “Framework”) for LTM, systematizing both data design and training processes while taking into account the complex structures and interdependencies unique to telecommunications network data.
The Framework utilizes telecommunications-related public datasets as well as proprietary datasets held by SoftBank, including network data and expertise in network design, management, and operations. LTM is enhanced through staged additional training that integrates continual pre-training*2, fine-tuning, and reinforcement learning*3.
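The staged additional training described above can be sketched as a sequence of checkpoints, where each stage (continual pre-training, fine-tuning, reinforcement learning) derives a new model from the previous one. The stage names and data sources below are illustrative assumptions, not SoftBank's actual pipeline or datasets.

```python
from dataclasses import dataclass, field

@dataclass
class Stage:
    name: str
    data_sources: list

@dataclass
class TrainingPipeline:
    stages: list = field(default_factory=list)
    log: list = field(default_factory=list)

    def add_stage(self, name, data_sources):
        self.stages.append(Stage(name, data_sources))
        return self

    def run(self, base_model):
        # Each stage produces a new checkpoint derived from the previous one,
        # so the final model carries the full staged-training lineage.
        model = base_model
        for stage in self.stages:
            model = f"{model}+{stage.name}"
            self.log.append((stage.name, stage.data_sources))
        return model

# Hypothetical stage ordering and data sources, for illustration only.
pipeline = (
    TrainingPipeline()
    .add_stage("continual_pretraining", ["3GPP specs", "public telecom corpora"])
    .add_stage("fine_tuning", ["network design Q&A", "operations runbooks"])
    .add_stage("reinforcement_learning", ["preference-ranked operator feedback"])
)
final = pipeline.run("base-llm")
print(final)  # base-llm+continual_pretraining+fine_tuning+reinforcement_learning
```

The key property this models is that the stages are ordered and cumulative: fine-tuning starts from the continually pre-trained checkpoint, and reinforcement learning starts from the fine-tuned one.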
In addition, telecom-domain data includes not only conventional document-based materials but also tabular data and code-based descriptions in diverse formats. These datasets are reorganized and transformed into synthetic data optimized for each training stage and objective. Furthermore, SoftBank improves training data quality through LLM-based data filtering and enhances learning efficiency and model performance through hyperparameter optimization (HPO)*4 using small language models (SLMs).
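The LLM-based data filtering mentioned above can be sketched as a judge model scoring each candidate training example, with only examples above a quality threshold retained. The `judge_quality` heuristic below is a stand-in for a real LLM judge, and the keyword list and threshold are invented for illustration.

```python
def judge_quality(example: str) -> float:
    # Stand-in for an LLM judge: reward telecom-relevant, substantive text.
    # A real pipeline would prompt a judge model for a quality score instead.
    keywords = ("handover", "5G", "RAN", "latency", "spectrum")
    hits = sum(1 for k in keywords if k.lower() in example.lower())
    return min(1.0, 0.2 * hits + min(len(example), 200) / 400)

def filter_dataset(examples, threshold=0.5):
    # Keep only examples the judge rates at or above the threshold.
    return [e for e in examples if judge_quality(e) >= threshold]

candidates = [
    "Handover failures spiked after the 5G RAN upgrade due to latency jitter.",
    "lorem ipsum",
    "Spectrum reallocation reduced RAN congestion and improved latency.",
]
kept = filter_dataset(candidates)
print(len(kept))  # 2 — the placeholder text is filtered out
```

The same filtered pool can then feed the stage-specific synthetic-data transformations, so each training stage sees only examples that pass the quality gate.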
Through this Framework, SoftBank has established a foundation that enables reproducible improvement of telecom-domain performance while flexibly adapting to data expansion, updates to base models, and evolving business requirements.
Achieving a top-tier ranking in the GSMA Open-Telco LLM Benchmarks
Through the advancement enabled by this Framework, LTM achieved a top-tier overall ranking in the GSMA Open-Telco LLM Benchmarks, led by the GSMA, the global telecommunications industry organization.
The GSMA Open-Telco LLM Benchmarks evaluate LLM performance across multiple datasets reflecting real-world telecommunications operations, including telecom specification comprehension, domain-specific question answering, operational log interpretation, mathematical reasoning in telecom contexts, and configuration description. Rankings are determined based on the average scores across all evaluation dimensions.
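The overall ranking method described above, an average across all evaluation dimensions, can be sketched as follows. The model names, dataset names, and scores are invented for illustration and do not reflect actual benchmark results.

```python
from statistics import mean

# Hypothetical per-dataset scores for two submitted models.
scores = {
    "model_a": {"spec_qa": 0.81, "log_interpretation": 0.74, "telecom_math": 0.69},
    "model_b": {"spec_qa": 0.77, "log_interpretation": 0.80, "telecom_math": 0.72},
}

# Overall score: the mean across all evaluation datasets.
overall = {model: mean(ds.values()) for model, ds in scores.items()}

# Rank models by overall score, highest first.
ranking = sorted(overall, key=overall.get, reverse=True)
print(ranking[0])  # model_b
```

Averaging across all datasets means a model cannot rank highly by excelling at one task alone; it must perform consistently across specification comprehension, log interpretation, and the other dimensions.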
This achievement demonstrates the effectiveness of LTM as a telecom-specialized model leveraging the data assets and operational expertise accumulated by SoftBank as a telecommunications operator. It also highlights the international recognition of SoftBank’s capabilities and technological expertise in developing generative AI foundation models.
SoftBank will continue to promote the full-scale internal adoption of LTM to reduce reliance on individual expertise in network operations, alleviate operational workloads, and improve operational efficiency, thereby contributing to the delivery of higher-quality mobile network services.
Ryuji Wakikawa, Vice President, Head of the Research Institute of Advanced Technology at SoftBank
SoftBank has been developing a telecom industry–specific LTM, trained on telecom expertise and real operational data, and achieved top-class results in the GSMA Open-Telco LLM Benchmarks. This demonstrates that our training foundation is also at a high international standard. By leveraging LTM, SoftBank will thoroughly enhance its operations and lead the advancement of the telecommunications industry.
Louis Powell, Director of AI initiatives, GSMA
Telecom networks demand precision and context that general-purpose AI often struggles to deliver. By testing models against telecom-relevant datasets and tasks, the GSMA Open-Telco LLM Benchmarks spotlight genuine capability improvements. SoftBank’s top-tier ranking is a strong example of that progress, and exactly the kind of momentum the industry needs as it scales AI responsibly into operations.
Ray Sharma is an Industry Analyst and Editor at The Fast Mode. He has over 15 years of experience in mobile broadband technologies and solutions, conducting research and analysis on various technology segments and producing articles and write-ups on the latest developments within the sector. He is also in charge of social media engagement and industry liaisons.
