RAGs, Small Language Models, and Domain-Specific AI

A New Frontier in Practical Applications

Ryan M. Raiker, MBA
10 min read · Dec 10, 2024

The buzz surrounding generative AI and large language models (LLMs) has dominated headlines, but a quieter revolution is brewing in the form of RAGs (Retrieval-Augmented Generation), small language models, and their use in domain-specific applications. While LLMs like GPT-4 capture imaginations with their broad capabilities, these emerging technologies demonstrate that smaller, more focused models can often yield greater utility in real-world scenarios.

In this article, I’ll dive deeper into what these technologies mean for industries, why they’re game-changers, and how they can be applied to address specific challenges. Whether you’re a tech enthusiast or a business leader looking for practical solutions, there’s a lot to get excited about.

What Are Retrieval-Augmented Generation (RAG) Models?

RAG represents a clever marriage of two complementary ideas: a language model’s ability to generate human-like text and the efficiency of a retrieval system that fetches relevant information from external sources.
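The retrieve-then-generate flow described above can be sketched in a few lines. This is a minimal, illustrative example: the corpus, the word-overlap scoring, and the `generate()` stub are placeholders standing in for a real vector store and model call, not any particular library’s API.

```python
import re

def retrieve(query, corpus, k=2):
    """Rank documents by naive word overlap with the query (a stand-in
    for real embedding similarity search)."""
    q_words = set(re.findall(r"\w+", query.lower()))
    scored = sorted(
        corpus,
        key=lambda doc: -len(q_words & set(re.findall(r"\w+", doc.lower()))),
    )
    return scored[:k]

def build_prompt(query, docs):
    """Augment the prompt with the retrieved context."""
    context = "\n".join(f"- {d}" for d in docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

def generate(prompt):
    # Stand-in for a language model call (e.g., a small local model).
    return f"[model output for prompt of {len(prompt)} chars]"

corpus = [
    "RAG combines retrieval with text generation.",
    "Small language models run cheaply on commodity hardware.",
    "Retrieval fetches documents relevant to the user's query.",
]

query = "How does retrieval help generation?"
docs = retrieve(query, corpus)
answer = generate(build_prompt(query, docs))
```

In a production system, `retrieve` would query an embedding index and `generate` would call the language model, but the pattern is the same: fetch grounding documents first, then condition generation on them.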

Think of a RAG system as the ultimate assistant: one that not only has a deep understanding of the world but can also…

