
TII Unveils Falcon Arabic, Falcon-H1 To Advance AI Performance And Accessibility

KUALA LUMPUR, May 22 (Bernama) -- Technology Innovation Institute (TII), the applied research arm of Abu Dhabi’s Advanced Technology Research Council (ATRC), has introduced two significant artificial intelligence (AI) models—Falcon Arabic and Falcon-H1—marking a major leap in regional language modelling and efficient AI design.

Falcon Arabic, built on top of Falcon 3-7B (seven-billion-parameter), is the first Arabic language model in the Falcon series and has been recognised as the best-performing Arabic AI model in the region.

Trained on a high-quality native Arabic dataset spanning Modern Standard Arabic and regional dialects, it ranks above all other regional models on the Open Arabic LLM Leaderboard. Impressively, it performs at the level of models up to 10 times its size, demonstrating the strength of its architecture over mere scale.

Alongside Falcon Arabic, TII said in a statement that it had unveiled Falcon-H1, a hybrid-architecture model that redefines the balance between performance and portability.

Targeted at the small-to-medium model category (30–70 billion parameters), Falcon-H1 outperforms comparable models from Meta’s LLaMA and Alibaba’s Qwen, enabling high-performance AI in low-resource settings and even on edge devices. Its multilingual tokeniser supports over 100 languages, further extending its global utility.

Developed to be smarter, more efficient, and easier to deploy, Falcon-H1 blends the strengths of Transformer and Mamba architectures, achieving faster inference speeds and lower memory use. Its flexible design comes in a range of sizes—from 500 million to 34 billion parameters—making it suitable for everything from mobile devices to enterprise-level applications.

Falcon models are already seeing real-world application. In collaboration with the Bill & Melinda Gates Foundation, Falcon supported AgriLLM, an AI solution that helps farmers respond to climate challenges. Globally, Falcon models have been downloaded over 55 million times and remain the most prominent open-source AI ecosystem emerging from the Middle East.

All Falcon models are open source under the TII Falcon License (based on Apache 2.0) and are available via Hugging Face and the official Falcon website, supporting ethical and accessible AI innovation for developers, researchers, and institutions worldwide.
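For developers, the sketch below shows one common way an open Falcon checkpoint can be loaded from Hugging Face using the transformers library. It is a minimal illustration rather than guidance from TII's announcement; the repository ID and generation settings are assumptions, and the exact model names and licence terms should be checked on the official Falcon pages.

```python
# Minimal sketch: loading an open Falcon checkpoint with Hugging Face transformers.
# The repository ID below is illustrative/assumed; consult the Falcon pages on
# Hugging Face for the actual model names and the TII Falcon License terms.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/Falcon3-7B-Instruct"  # assumed repo ID, shown for illustration only

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" requires the `accelerate` package; drop it for a plain CPU load.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Summarise the benefits of hybrid Transformer-Mamba architectures."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```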

-- BERNAMA