microsoft/phi-1_5

microsoft/phi-1_5 is a 1.3-billion-parameter Transformer-based language model developed by Microsoft. Trained on a curated dataset that includes synthetic NLP texts, it shows strong performance on common sense, language understanding, and logical reasoning benchmarks among small models. This base model is intended for research into AI safety challenges and offers text generation, summarization, and Python code generation capabilities.
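
A minimal sketch of loading the model for text or code generation with the Hugging Face transformers library; the prompt and generation settings below are illustrative assumptions, not part of this listing.

```python
# Sketch: run microsoft/phi-1_5 with transformers (prompt and settings are assumptions).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-1_5"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Example prompt asking the model to complete a Python function.
prompt = '''def print_prime(n):
   """
   Print all primes between 1 and n
   """'''

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```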

Status: Warm
Visibility: Public
Parameters: 1.4B
Precision: BF16
Context length: 2048
License: MIT
Source: Hugging Face
