
Microsoft AI Releases Phi-3 Family of Models: A 3.8B Parameter Language Model Trained on 3.3T Tokens That Runs Locally on Your Phone


Introducing Microsoft’s Phi-3 Family of Models

Microsoft has introduced the Phi-3 family of language models, with Phi-3-mini as the standout. The model has 3.8 billion parameters and was trained on a heavily curated dataset of 3.3 trillion tokens. Despite its small size, it supports local inference on modern smartphones, making it a practical and accessible option.
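To make "local inference" concrete, here is a minimal sketch of loading and prompting Phi-3-mini with Hugging Face transformers. The model id microsoft/Phi-3-mini-4k-instruct, the half-precision setting, and the prompt are illustrative assumptions rather than details from the announcement, and a reasonably recent transformers version is assumed.

```python
# Minimal sketch: load Phi-3-mini and generate a short reply.
# Assumes the Hugging Face model id "microsoft/Phi-3-mini-4k-instruct"
# and a recent transformers release with native Phi-3 support.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed model id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision keeps the memory footprint small
    device_map="auto",          # place weights on GPU or CPU automatically
)

messages = [{"role": "user", "content": "Summarize what Phi-3-mini is in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```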

Practical Solutions and Value

The Phi-3-mini model offers practical solutions for language understanding and reasoning, comparable to larger models, while being optimized for mobile devices. It can be quantized to 4 bits, occupying approximately 1.8GB of memory and achieving over 12 tokens per second on an iPhone 14 with the A16 Bionic chip. This makes it a valuable tool for various language tasks, especially when storage and processing power are limited.
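The roughly 1.8GB figure follows directly from the arithmetic: 3.8 billion weights at 4 bits each is about 1.9 GB. The sketch below checks that estimate and shows a 4-bit load with bitsandbytes on a desktop or server; the phone deployment described above uses its own on-device runtime, so this is only an analogous illustration under those assumptions.

```python
# Back-of-the-envelope check of the ~1.8GB claim for 4-bit weights.
params = 3.8e9                     # Phi-3-mini parameter count
bytes_per_param = 4 / 8            # 4 bits per weight
approx_gib = params * bytes_per_param / (1024 ** 3)
print(f"~{approx_gib:.2f} GiB of weights")  # ≈ 1.77 GiB, i.e. roughly 1.8GB

# Hedged sketch of an equivalent 4-bit load with transformers + bitsandbytes
# (requires the bitsandbytes package and a GPU; not the phone runtime itself).
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)
model = AutoModelForCausalLM.from_pretrained(
    "microsoft/Phi-3-mini-4k-instruct",  # assumed model id, as above
    quantization_config=quant_config,
    device_map="auto",
)
```

On phones, models of this size are typically served through a dedicated runtime (for example GGUF-style or ONNX formats) rather than through transformers, which is why the estimate above, not the exact snippet, is the portable part.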

Furthermore, the training methodology emphasizes data quality rather than raw compute scale, which yields strong performance for the parameter count. Post-training combines supervised instruction fine-tuning with preference tuning, improving the model's chat capabilities, robustness, and safety.
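To clarify what those two post-training stages consume, the sketch below shows the conventional shape of supervised fine-tuning and preference-tuning records. The field names and examples are generic illustrations of these techniques, not Microsoft's actual (unreleased) training data.

```python
# 1) Supervised instruction fine-tuning: (instruction, response) chat pairs.
sft_example = {
    "messages": [
        {"role": "user", "content": "Explain quantization in one sentence."},
        {"role": "assistant", "content": "Quantization stores weights in fewer bits to cut memory use."},
    ]
}

# 2) Preference tuning (e.g. DPO-style): the same prompt with a preferred and a
#    rejected answer, which teaches the model which behavior to favor for
#    helpfulness and safety.
preference_example = {
    "prompt": "How do I disable my car's seatbelt alarm?",
    "chosen": "I'd keep the alarm enabled for safety; a mechanic can check whether it is malfunctioning.",
    "rejected": "Just cut the sensor wire under the seat.",
}
```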

While Phi-3-mini shows that a small model can achieve performance comparable to much larger counterparts, it also highlights the need for further work on multilingual capabilities and on augmentation with search engines to handle a broader range of language tasks.

For companies looking to evolve with AI, Microsoft's Phi-3 family offers a practical option for language tasks in scenarios with limited storage and processing power, with a clear focus on practicality and accessibility.



Vladimir Dyachkov, Ph.D.
Editor-in-Chief, itinai.com

I believe that AI is only as powerful as the human insight guiding it.

Unleash Your Creative Potential with AI Agents

Competitors are already using AI Agents

Business Problems We Solve

  • Automation of internal processes
  • Optimizing AI costs without huge budgets
  • Training staff and developing custom courses for business needs
  • Integrating AI into client work and automating first lines of contact

Large and Medium Businesses

Startups

Offline Business

100% of clients report increased productivity and reduced operational costs.
