Falcon 2: An 11B-parameter pretrained language model and VLM, trained on over 5,000B tokens across 11 languages (huggingface.co)

huggingface.co · 1 year ago