Microsoft Releases Compact Language AI Model Phi-2
Microsoft Research announced its Phi-2 small language model (SLM) on Tuesday local time, a text-to-text artificial intelligence model that, according to a post on the X platform, is 'small enough to run on laptops or mobile devices'.
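For readers who want to try the model locally, the sketch below shows one way to load it for research use with the Hugging Face transformers library. It assumes the weights are published under the microsoft/phi-2 repository and that torch and a recent transformers release are installed (older releases may require trust_remote_code=True).

```python
# Minimal sketch: run Phi-2 locally with Hugging Face transformers (research use).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-2"  # Hugging Face repository for the released weights
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    # Half precision keeps the 2.7B weights around 5 GB; fall back to fp32 on CPU.
    torch_dtype=torch.float16 if device == "cuda" else torch.float32,
).to(device)

prompt = "Explain why the sky appears blue, in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```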
With 2.7 billion parameters, Phi-2 performs comparably to larger models such as Meta's Llama 2-7B and Mistral-7B, each of which has 7 billion parameters.
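These parameter counts translate directly into memory requirements, which is roughly why Phi-2 fits on laptop-class hardware while the 7-billion-parameter models are a tighter squeeze. A back-of-envelope calculation (illustrative only; real usage also depends on activations and the KV cache):

```python
# Rough weight-memory estimate: parameters x bytes per parameter (2 bytes = fp16).
def weight_memory_gib(num_params: float, bytes_per_param: int = 2) -> float:
    return num_params * bytes_per_param / (1024 ** 3)

for name, params in [("Phi-2", 2.7e9), ("Llama 2-7B", 7e9), ("Mistral-7B", 7e9)]:
    print(f"{name}: ~{weight_memory_gib(params):.1f} GiB of weights in fp16")
# Phi-2 comes out around 5 GiB versus roughly 13 GiB for the 7B models.
```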
In the blog post announcing Phi-2, Microsoft Research also noted that despite having fewer parameters than Google's new Gemini Nano 2 model (which has about 3.25 billion), Phi-2 outperforms it, and that Phi-2 shows less toxicity and bias in its responses than Llama 2.
Microsoft also took the opportunity to take a slight dig at Google. In a demo video for Gemini Ultra, its upcoming largest and most capable AI model, Google showcased the model solving relatively complex physics problems and even correcting a student's errors. It turns out that although Phi-2 may be only a fraction of Gemini Ultra's size, it can answer the same prompts correctly and correct the student's work as well.
However, despite these encouraging results, Phi-2 currently has a major limitation: it is released under Microsoft's research license, which restricts it to 'research purposes only' rather than commercial use. The license further stipulates that Phi-2 may only be used for 'non-commercial, non-revenue-generating research purposes.' Businesses hoping to build products on top of it are therefore out of luck for now.