Introducing Falcon-40B, a brand new language model trained on 1,000B tokens.
What’s included:
– 7B and 40B models made available by TII
– surpasses LLaMA 65B and other models like MPT and RedPajama on the Open LLM Leaderboard
– architecture is optimized for inference, with FlashAttention and multi-query attention
– Instruct model available
– license permits personal and research use, and commercial use with limitations
https://huggingface.co/tiiuae/falcon-40b
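One of the inference optimizations mentioned above, multi-query attention, can be sketched in a few lines: every query head shares a single key/value head, which shrinks the KV cache during generation. This is an illustrative NumPy sketch of the general technique, not Falcon's actual implementation; all names and shapes here are assumptions.

```python
import numpy as np

def multiquery_attention(x, wq, wk, wv, n_heads):
    # Multi-query attention sketch: n_heads query heads,
    # but only ONE shared key head and ONE shared value head.
    seq, d_model = x.shape
    head_dim = d_model // n_heads
    q = (x @ wq).reshape(seq, n_heads, head_dim)  # per-head queries
    k = x @ wk                                    # single shared key head
    v = x @ wv                                    # single shared value head
    scores = np.einsum("qhd,kd->hqk", q, k) / np.sqrt(head_dim)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    out = np.einsum("hqk,kd->qhd", weights, v)
    return out.reshape(seq, n_heads * head_dim)

rng = np.random.default_rng(0)
d, h = 16, 4
x = rng.standard_normal((5, d))
wq = rng.standard_normal((d, d))
wk = rng.standard_normal((d, d // h))  # K/V projections are one head wide
wv = rng.standard_normal((d, d // h))
out = multiquery_attention(x, wq, wk, wv, h)
print(out.shape)  # (5, 16)
```

Because K and V are a single head, the per-token cache is n_heads times smaller than in standard multi-head attention, which is where the inference speedup comes from.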
❤️ If you'd like to support the channel ❤️
Support here:
Patreon – https://www.patreon.com/1littlecoder/
Ko-Fi – https://ko-fi.com/1littlecoder