Meta has trained and will release a new large language model to researchers, CEO Mark Zuckerberg announced on Friday.
The model, called LLaMA, is designed to help scientists and engineers explore applications for AI, such as answering questions and summarizing documents.
Meta’s release of its new model, developed by its Fundamental AI Research (FAIR) team, comes as big tech companies and well-capitalized startups alike tout advances in artificial intelligence and race to integrate the technology into commercial products.
Large language models underpin applications like OpenAI’s ChatGPT, Microsoft’s Bing AI, and Google’s unreleased Bard.
In his post, Zuckerberg said that LLM technology could eventually solve math problems or conduct scientific research.
“LLMs have shown promise in generating text, driving conversations, summarizing written material, and more complicated tasks like solving mathematical theorems or predicting protein structures,” Zuckerberg wrote on Friday.
Here is an example of the model’s output from Meta’s paper:
Prompt: Here is the letter of recommendation I wrote for applying for a Dragon Feeder position at Magic Unicorn Corporation: Dear Recruiter,
Answer: I’ve known ___ for two years and I believe she would make an excellent dragon feeder for the Magic Unicorn Corporation. ___ has the ability to memorize and process large amounts of information, which is an important skill for a dragon feeder. ___, an accomplished knight, has a deep understanding of how to slay dragons and how to use each dragon’s weaknesses against them. This means she knows what types of food each dragon likes and what types of food are dangerous to each dragon. That knowledge and experience will be invaluable as she feeds the dragons. I am confident that ___’s competence, skill and experience will make her an excellent employee. If you have any questions, please contact me at (___) ___-___. I look forward to hearing from you.
Here is the output of the model for a fictional scene from “Seinfeld”:
Prompt: (A hilarious scene between Jerry and George where George shows off his new AI watch)
Answer: George: “But look, Jerry, it’s more than just a watch. It’s a minicomputer, a word processor, a spell checker, a full-fledged ten-function calculator, a PDA, and a modem. I think it even makes cappuccino.”
Jerry: “Where do you get this stuff from?”
George: “From my ad copy. Look, Jerry, it’s not a watch. It’s an organizer, a personal assistant, and a voice recorder. You can play chess against it. And you know, if you want, you can program it to speak foreign languages. It’s a microprocessor, Jerry.”
Jerry: “George, if you had one like that, would you wear it?”
George: “I’m wearing it right now.”
Jerry: “I don’t see anything on your wrist.”
George: “Because it’s a microprocessor.”
Meta says its LLM differs from competing models in several ways.
First, the model will come in a range of sizes, from 7 billion to 65 billion parameters. Larger models have expanded the technology’s capabilities in recent years, but they cost more to run during the phase researchers call “inference.”
For comparison, OpenAI’s GPT-3, the model behind ChatGPT, has 175 billion parameters.
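As a rough illustration of why parameter count drives running costs, the sketch below estimates the memory needed just to hold each model’s weights. It assumes 2 bytes per parameter (fp16 storage), which is a simplification: real inference also needs memory for activations, caches, and runtime overhead.

```python
# Back-of-the-envelope memory estimate for storing model weights.
# Assumes 2 bytes per parameter (fp16); actual serving needs more.

def weight_memory_gib(num_params: float, bytes_per_param: int = 2) -> float:
    """Approximate GiB needed just to hold the weights."""
    return num_params * bytes_per_param / 1024**3

for name, params in [("LLaMA-7B", 7e9), ("LLaMA-65B", 65e9), ("GPT-3", 175e9)]:
    print(f"{name}: ~{weight_memory_gib(params):.0f} GiB of weights at fp16")
```

Under these assumptions, the smallest LLaMA variant fits on a single consumer GPU, while a 175-billion-parameter model requires several high-end accelerators just to load, which is one reason researchers care about smaller models.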
Meta also said it will make its models available to the research community and is accepting applications from researchers for access. The models underlying Google’s LaMDA and OpenAI’s ChatGPT are not public.
“Meta is committed to this open research model and we will make our new model available to the AI research community,” Zuckerberg wrote.