Microsoft’s Latest Small Model Phi-3 Released

Tiny but Mighty: The Phi-3 Small Language Models with Big Potential

Sometimes the best solutions come from unexpected places. That’s the lesson Microsoft researchers learned when they developed a new class of small language models (SLMs) that pack a powerful punch.

The Case in Point: Large language models (LLMs) have opened up exciting new possibilities for AI, but their massive size means they require significant computing resources. Microsoft’s researchers set out to create SLMs that offer many of the same capabilities as LLMs, but in a much smaller and more accessible package.

  • The researchers trained the Phi-3 family of SLMs on carefully curated, high-quality datasets, allowing them to outperform models of similar size and even larger models across a variety of benchmarks.
  • The first Phi-3 model, Phi-3-mini, has 3.8 billion parameters and performs better than models twice its size.

Go Deeper: The key to the Phi-3 models’ success was the researchers’ innovative approach to data selection and curation. Inspired by how children learn language, they built datasets focused on high-quality, educational content rather than relying on raw internet data.

  • The “TinyStories” dataset, for example, was created by prompting a large language model to generate millions of short stories using a limited vocabulary (a rough sketch of this kind of prompt-driven generation follows this list).
  • The “CodeTextbook” dataset was built by carefully selecting and filtering publicly available information to capture a wide scope of high-quality, textbook-like content.
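To make the idea concrete, here is a minimal sketch of prompt-driven synthetic data generation in the spirit of “TinyStories.” It assumes an OpenAI-style chat API; the model name, prompt wording, and themes are placeholders for illustration, not Microsoft’s actual recipe.

```python
# Illustrative sketch: repeatedly prompt a large model to write short stories
# that only use words a young child would know. Model name and prompt wording
# are assumptions for illustration, not the exact TinyStories recipe.
import random
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

THEMES = ["a lost kitten", "a rainy day", "a big red ball", "a kind friend"]

def generate_tiny_story(theme: str) -> str:
    """Ask the model for a short story written with a very limited vocabulary."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any capable chat model works
        messages=[
            {"role": "system",
             "content": "Write short stories using only words a "
                        "four-year-old would understand."},
            {"role": "user",
             "content": f"Write a 3-4 paragraph story about {theme}."},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # In practice a loop like this would run millions of times to build a corpus.
    for _ in range(3):
        print(generate_tiny_story(random.choice(THEMES)), "\n---")
```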

Why It Matters: SLMs like the Phi-3 models offer significant advantages over their larger counterparts. They can run on devices at the edge, minimizing latency and maximizing privacy, and are more accessible for organizations with limited resources.

  • SLMs are well-suited for tasks that don’t require extensive reasoning or that need a quick response, such as summarizing documents, generating marketing content, or powering customer support chatbots.
  • By keeping data processing local, SLMs can enable AI experiences in areas with limited connectivity, opening up new possibilities for applications like crop disease detection for farmers (a brief on-device loading sketch follows this list).
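As an illustration of on-device use, the sketch below loads Phi-3-mini locally with the Hugging Face transformers library. The checkpoint name is assumed to be the publicly released microsoft/Phi-3-mini-4k-instruct model, and the generation settings are illustrative defaults rather than recommended values.

```python
# Minimal sketch of running Phi-3-mini locally with Hugging Face transformers.
# Checkpoint name assumes the public Hugging Face release; generation settings
# are illustrative, not tuned values. device_map="auto" needs the accelerate
# package and falls back to CPU if no GPU is available.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="microsoft/Phi-3-mini-4k-instruct",
    device_map="auto",
    trust_remote_code=True,
)

prompt = "Summarize in two sentences: small language models can run on-device."
output = generator(prompt, max_new_tokens=128, do_sample=False)
print(output[0]["generated_text"])
```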

The Big Picture: While LLMs will remain the gold standard for complex tasks, Microsoft envisions a future where a portfolio of models, both large and small, work together to solve a wide range of problems.

  • SLMs and LLMs can complement each other, with LLMs acting as routers to direct certain queries to the more lightweight SLMs when appropriate (a simplified routing sketch follows this list).
  • This flexible approach allows organizations to choose the right-sized model for their specific needs and resources, unlocking the power of AI for a broader range of users and use cases.
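To show the routing idea in miniature, the sketch below sends simple queries to a small model and escalates harder ones to a large model. Note the swap: the article describes an LLM itself doing the routing, while this sketch uses a toy heuristic and stub model callables so it runs without any external services.

```python
# Hedged sketch of "right-sized model" routing: a cheap rule decides whether a
# query goes to a small model or is escalated to a large one. The rule and the
# model callables are placeholders, not a Microsoft implementation.
from typing import Callable

def route_query(
    query: str,
    small_model: Callable[[str], str],
    large_model: Callable[[str], str],
) -> str:
    """Send simple queries to the SLM; escalate complex ones to the LLM."""
    # Toy heuristic: long or explicitly multi-step questions go to the LLM.
    needs_heavy_reasoning = (
        len(query.split()) > 60 or "step by step" in query.lower()
    )
    model = large_model if needs_heavy_reasoning else small_model
    return model(query)

if __name__ == "__main__":
    # Stub models so the sketch is self-contained.
    slm = lambda q: f"[SLM] quick answer to: {q}"
    llm = lambda q: f"[LLM] detailed answer to: {q}"
    print(route_query("Summarize this paragraph for me.", slm, llm))
    print(route_query("Explain step by step how to plan a budget.", slm, llm))
```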

The Bottom Line: By developing the Phi-3 family of small language models, Microsoft has demonstrated that size isn’t everything when it comes to AI. These innovative SLMs offer a glimpse into a future where the benefits of powerful language models are more accessible and widely applicable, empowering more people to harness the potential of AI.

