IBM Launches Granite Version 4.0 and Granite-Docling
Written by Nikos Vaggalis
| Thursday, 23 October 2025 |
|
IBM has launched Granite 4.0, the next generation of its open-source, small and efficient language models, together with Granite-Docling, its next-generation document format converter.

Eighteen months ago, in IBM Launches The Granite Code LLM Series, we witnessed the beginnings of IBM's decoder-only models suitable for code generation tasks, trained on code written in 116 programming languages and ranging in size from 3 to 34 billion parameters. Benchmarked on HumanEvalPack, HumanEvalPlus, and RepoBench, they showed strong performance on code synthesis, fixing, explanation, editing, and translation across most major programming languages, including Python, JavaScript, Java, Go, C++, and Rust. Then in October 2024 we reported on the Granite 3.0 collection of generative AI models, which included a new, instruction-tuned, dense decoder-only LLM.

Now IBM has followed up with Granite 4.0, a new generation of its open-source language models designed to run faster, more cost-efficiently and with stronger safeguards. Instead of following in the footsteps of OpenAI or Meta and chasing trillion-parameter heavyweight models, Granite opts for fewer parameters while still delivering strong performance, thanks to a hybrid Mamba/transformer architecture that reduces memory requirements by 70% or more. That means the models can run on lower-spec GPUs. The available models differ in parameter size and architecture.
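Since the models ship with open weights, a quick way to try one is through the Hugging Face transformers library. What follows is a minimal sketch, not IBM's official recipe: the checkpoint ID granite-4.0-h-tiny is an assumption based on the collection's naming and should be checked against the Hugging Face hub, and the hybrid Mamba layers may need extra dependencies beyond a recent transformers release.

```python
# Minimal sketch: load a Granite 4.0 model with Hugging Face transformers.
# Assumption: "ibm-granite/granite-4.0-h-tiny" follows the collection's
# naming scheme; check the hub for the exact checkpoint ID.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-4.0-h-tiny"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # reduced-precision weights for smaller GPUs
    device_map="auto",           # requires the accelerate package
)

# The Granite models are instruction-tuned, so use the chat template.
messages = [{"role": "user", "content": "Summarise what a RAG pipeline does."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=200)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The bfloat16 weights and device_map="auto" are what make the reduced memory footprint practical on a single mid-range GPU.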
In contrast to the earlier Granite versions, Granite 4.0 is not trained just on code, but on samples drawn from a carefully compiled 22T-token corpus of enterprise-focused training data, both synthetic and open datasets, across domains that include language, code, math, reasoning and more. Furthermore, it is also the first open-weight model family certified under ISO 42001, the first international standard for an Artificial Intelligence Management System. Implementing this standard means putting in place policies and procedures for the sound governance of an organization in relation to AI. In other words, it means that this open-source model adheres to high standards and can be used in an enterprise setting in production.

Granite 4.0 models are available on IBM watsonx.ai, Docker Hub, Hugging Face, Kaggle and Amazon SageMaker JumpStart, with Microsoft Azure AI Foundry coming soon.

But that's not all. Together with the Granite models, IBM also released Granite-Docling, officially Granite-Docling-258M, an ultra-compact vision language model that unlocks content while preserving structure, converting text, equations, code and tables into clean, machine-readable formats. It could be considered a strong competitor to Docling, the gold standard for parsing documents and exporting them to the desired format, since Docling can read PDF, DOCX, PPTX, XLSX, images, HTML, AsciiDoc and Markdown and export them to HTML, Markdown and JSON, with embedded and referenced images, ready for gen AI uses. Although they differ in purpose, they're both developed by IBM's Deep Search department.

What Granite-Docling does differently is that, instead of converting documents directly to Markdown and thereby losing a lot of information, it translates the complex structural elements into DocTags, a markup language created specifically for this purpose. Rather than throwing information away at the conversion-to-Markdown step, DocTags can describe the relationships between the different elements of a document, such as charts, tables, forms, code and equations, and encode them faithfully. Users can then easily convert that format to Markdown, HTML, JSON or plain text, as the sketch below shows. Needless to say, this is invaluable for building enterprise-level RAG applications.
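To make that round trip concrete, here is a minimal sketch using the open-source docling package's documented quickstart API. The file name report.pdf is a placeholder, and Granite-Docling is the kind of model that can power such a pipeline under the hood.

```python
# Minimal sketch of a Docling conversion pipeline, following the documented
# quickstart of the open-source `docling` package (pip install docling).
from docling.document_converter import DocumentConverter

converter = DocumentConverter()

# The source can be a local path or a URL; PDF, DOCX, PPTX, XLSX, images,
# HTML, AsciiDoc and Markdown are among the supported inputs.
result = converter.convert("report.pdf")  # "report.pdf" is a placeholder

# The intermediate document representation keeps the structural elements
# (tables, code, equations) that a straight-to-Markdown conversion would
# flatten, and can then be exported to the format you need.
print(result.document.export_to_markdown())
```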
More Information
Granite-Docling’s Hugging Face model card
Related Articles
IBM Launches The Granite Code LLM Series
To be informed about new articles on I Programmer, sign up for our weekly newsletter, subscribe to the RSS feed and follow us on Twitter, Facebook or LinkedIn.
Last Updated (Thursday, 23 October 2025)


