Talk:Llama (language model)

From Wikipedia, the free encyclopedia

Open source

I don't get why there is the GPLv3 reference and the claim of open source, since it clearly is *NOT* open source, as can be seen in the license https://github.com/facebookresearch/llama/blob/main/LICENSE Bertboerland (talk) 20:40, 30 November 2023 (UTC)[reply]

I've removed this article from Category:Open-source artificial intelligence, as Llama is source-available but its license has restrictions that prevent it from being open-source, per sources such as Ars Technica. — Newslinger talk 21:07, 8 December 2024 (UTC)[reply]

Restored 4chan references

@DIYeditor: per your indication at metawiki, I have restored a version of this article to one that includes the 4chan links. Any subsequent edits may need to be redone. — billinghurst sDrewth 20:49, 6 December 2023 (UTC)[reply]

IP range block on this article for the person just removing the references and not communicating about their changes. — billinghurst sDrewth 11:33, 27 December 2023 (UTC)[reply]

Move page to "Llama (language model)"

All versions since Llama 2 no longer use the capitalization we currently see in the article title. Adding "(language model)" is in line with similar articles such as Claude (language model) and Gemini (language model).

Should we move to reflect this change? Nuclearelement (talk) 11:45, 13 May 2024 (UTC)[reply]

Ranging between 1B and 405B

Explain this. 1 banana or 1 billion? 1 billion of what, exactly: parameters? Kr 17387349L8764 (talk) 09:05, 1 October 2024 (UTC)[reply]

Context length is not changed during fine-tuning

"Unlike GPT-4 which increased context length during fine-tuning, Llama 2 and Code Llama - Chat have the same context length of 4K tokens."

GPT-4 did not increase context length during fine-tuning. As far as I know, no LLMs change context length like that. vvarkey (talk) 04:49, 16 October 2024 (UTC)[reply]