Mozilla is looking to deploy its roughly $1.4 billion in reserves to support "mission driven" companies and nonprofits, and is particularly focused on AI.
Let me know when they create an alternative to transformers and PyTorch, and build a tokenizer without the OpenAI-style QKV hidden layers of alignment. That is where all the fuckery is happening. This stack is standard and used for all models, it is effectively proprietary, and it is a primary reason why they are “open weights” but not Open Source.
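For context, the "QKV" layers the comment refers to are the query/key/value projections of standard transformer attention, which live in the model itself rather than in the tokenizer. A minimal NumPy sketch of that computation, with all names my own for illustration:

```python
import numpy as np

def qkv_attention(x, w_q, w_k, w_v):
    """Scaled dot-product attention with learned Q/K/V projections.

    x: (seq_len, d_model) input embeddings
    w_q, w_k, w_v: (d_model, d_model) projection matrices
    """
    q = x @ w_q                      # queries
    k = x @ w_k                      # keys
    v = x @ w_v                      # values
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)  # scaled similarity between positions
    # softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v               # weighted mix of value vectors

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = qkv_attention(x, w_q, w_k, w_v)
print(out.shape)  # one output vector per input position
```

This mechanism comes from the original transformer paper and is openly documented; the "open weights vs. open source" distinction is usually about withheld training data and code, not the attention math itself.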