In this video we implement an open-source LLM that runs on your own machine and can even be used offline! We use Meta's OPT model, a family whose largest 175-billion-parameter variant rivals GPT-3 in performance. Even more interestingly, we'll use the instruction-tuned version of it, known as OPT-IML.
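As a quick taste of what the video walks through, here is a minimal sketch of loading an instruction-tuned OPT checkpoint locally with Hugging Face transformers. The exact checkpoint isn't spelled out in this description, so facebook/opt-iml-1.3b is assumed below because it fits on a typical consumer machine; swap in a larger OPT-IML variant if your hardware allows.

```python
# Minimal sketch (not necessarily the exact code from the video):
# run an instruction-tuned OPT model locally with Hugging Face transformers.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="facebook/opt-iml-1.3b",  # assumed checkpoint: instruction-tuned OPT (OPT-IML), 1.3B params
)

prompt = "Explain what instruction tuning does to a language model:"
result = generator(prompt, max_new_tokens=60, do_sample=False)
print(result[0]["generated_text"])
```

After the first run the weights are cached locally by transformers, so subsequent runs can work without an internet connection.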
Mentioned in the video:
- OPT (Open Pre-trained Transformers): https://huggingface.co/docs/transform...
- OPT Paper: https://arxiv.org/pdf/2205.01068.pdf
- "Finetuned Language Models Are Zero-Shot Learners" paper: https://arxiv.org/pdf/2109.01652.pdf
- One or two slides are from 陳縕儂 @VivianMiuLab of National Taiwan University CSIE
- Instruction-tuned OPT (OPT-IML): https://huggingface.co/facebook/opt-i...
- Watch PART 1 of the LangChain / LLM series:
• LangChain + OpenAI tutorial: Building...
- Watch PART 2 of the LangChain / LLM series:
• LangChain + OpenAI to chat w/ (query)...
- Watch PART 3 of the LangChain / LLM series:
LangChain + HuggingFace's Inference API (no OpenAI credits required!)
• LangChain + HuggingFace's Inference A...
- Watch PART 4 of the LangChain / LLM series:
Understanding Embeddings in LLMs (ft LlamaIndex + Chroma db)
• Understanding Embeddings in LLMs (ft ...
- Watch PART 5 of the LangChain / LLM series:
Query any website with GPT-3 and LlamaIndex
• GPT scrapes + answers from any sites ...
All the code for the LLM (large language model) series featuring GPT-3, ChatGPT, LangChain, LlamaIndex and more is on my GitHub repository, so go and ⭐ star or 🍴 fork it. Happy Coding!
https://github.com/onlyphantom/llm-py...