Blog

  • LlamaIndex Chat-UI

    Build a chat UI for your LLM app in minutes with LlamaIndex chat-ui! @llamaindex/chat-ui is a React component library that provides ready-to-use UI elements for building chat interfaces in LLM (Large Language Model) applications, built to pair with the Vercel AI SDK. The package is designed to streamline the development of chat-based user interfaces…

    Continue reading

  • Llama-4

    Ahmad Al-Dahle shared a glimpse into Meta’s massive AI project: training Llama 4 on a cluster with over 100,000 H100 GPUs! This scale is pushing AI boundaries and advancing both product capabilities and open-source contributions. “Great to visit one of our data centers where we’re training Llama 4 models on a cluster bigger than 100K H100s!”…

    Continue reading

  • Meta Launches Quantized Llama Models

    Meta has announced a major advancement in AI technology by releasing its first lightweight quantized Llama models. These models, small and efficient enough to run on many popular mobile devices, represent a breakthrough in accessibility and on-device performance (a rough local-inference sketch follows this entry). What makes quantized Llama models stand out? Meta’s…

    Continue reading
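
    Meta’s on-device release targets mobile runtimes, but to make “running a quantized Llama” concrete, here is a minimal sketch that chats with a 4-bit community GGUF quantization on a laptop via llama-cpp-python. The model filename, thread count, and prompt are illustrative assumptions, and this is not Meta’s official mobile deployment path.

    ```python
    # Sketch: running a 4-bit quantized Llama checkpoint locally with llama-cpp-python.
    # The GGUF path below is a placeholder for any small quantized Llama model
    # (e.g., a community quantization of Llama 3.2 1B/3B Instruct) downloaded beforehand.
    from llama_cpp import Llama

    llm = Llama(
        model_path="./models/llama-3.2-1b-instruct-q4_k_m.gguf",  # hypothetical local file
        n_ctx=2048,   # context window
        n_threads=4,  # CPU threads; tune for your device
    )

    response = llm.create_chat_completion(
        messages=[{"role": "user", "content": "Why does quantization help on mobile devices?"}],
        max_tokens=128,
    )
    print(response["choices"][0]["message"]["content"])
    ```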

  • Llama 3.2

    The two largest models in the Llama 3.2 collection, the 11B and 90B, are designed for image reasoning tasks such as document-level comprehension, including interpreting charts and graphs, image captioning, and visual grounding tasks like identifying objects in images based on natural language prompts (a short vision-model sketch follows this entry). For instance, someone could ask which month in the previous year…

    Continue reading
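
    As a rough sketch of the chart-reading use case above, here is how the 11B vision model can be queried through Hugging Face Transformers. It assumes access to the gated meta-llama/Llama-3.2-11B-Vision-Instruct checkpoint, a recent transformers release with Mllama support, and a GPU; the image file and question are placeholders.

    ```python
    # Sketch: chart/document question answering with Llama 3.2 11B Vision.
    import torch
    from PIL import Image
    from transformers import AutoProcessor, MllamaForConditionalGeneration

    model_id = "meta-llama/Llama-3.2-11B-Vision-Instruct"  # gated; requires license acceptance
    model = MllamaForConditionalGeneration.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )
    processor = AutoProcessor.from_pretrained(model_id)

    image = Image.open("monthly_sales_chart.png")  # placeholder chart image
    messages = [
        {"role": "user", "content": [
            {"type": "image"},
            {"type": "text", "text": "Which month had the highest sales last year?"},
        ]},
    ]
    prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
    inputs = processor(image, prompt, add_special_tokens=False, return_tensors="pt").to(model.device)

    output = model.generate(**inputs, max_new_tokens=64)
    print(processor.decode(output[0], skip_special_tokens=True))
    ```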

  • WordLlama

    WordLlama is a utility for NLP and word embedding that repurposes components from large language models (LLMs) to generate efficient and compact word representations, similar to GloVe, Word2Vec, or FastText (a brief usage sketch follows this entry). It starts by extracting the token embedding codebook from a state-of-the-art LLM (e.g., LLaMA3 70B) and trains a small, context-free model within a general-purpose embedding…

    Continue reading
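
    To show how those compact representations are used in practice, here is a minimal sketch with the wordllama package; the example strings are made up, and the default small model is downloaded on first load.

    ```python
    # Sketch: basic WordLlama usage for embeddings, similarity, and ranking.
    from wordllama import WordLlama

    wl = WordLlama.load()  # fetches the default compact embedding model on first use

    # Dense vectors for downstream tasks (clustering, dedup, retrieval, ...)
    embeddings = wl.embed(["llamas eat grass", "GPUs train large language models"])
    print(embeddings.shape)

    # Pairwise similarity and lightweight ranking
    print(wl.similarity("running llama locally", "hosting llama on my laptop"))
    candidates = ["quantized models", "chart understanding", "word embeddings"]
    print(wl.rank("compact text representations", candidates))
    ```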

  • How to Run Llama 3.1 Locally

    Llama 3.1 is the latest large language model (LLM) developed by Meta AI, following in the footsteps of popular AI assistants like ChatGPT. This article will guide you through what Llama 3.1 is, why you might want to use it, how to run it locally on Windows (a short Python sketch follows this entry), and some of its potential applications. Let’s dive in…

    Continue reading
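
    The post walks through the setup step by step; as one hedged illustration of the end state, here is how a locally served Llama 3.1 model can be queried from Python through Ollama. It assumes Ollama is installed and running and that the model has already been pulled with `ollama pull llama3.1`; this may not be the exact tooling the article uses.

    ```python
    # Sketch: chatting with a locally running Llama 3.1 model via the Ollama Python client.
    import ollama

    response = ollama.chat(
        model="llama3.1",
        messages=[{"role": "user", "content": "Give me three ideas for a local-first notes app."}],
    )
    print(response["message"]["content"])
    ```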

  • LlamaIndex Workflows

    Today, the LlamaIndex team introduced Workflows, a new event-driven way of building multi-agent applications. By modeling each agent as a component that subscribes to and emits events, you can build complex orchestration in a readable, Pythonic manner that leverages batching, async, and streaming (a minimal two-step example follows this entry). The announcement also covers the limitations of the earlier graph/pipeline-based approach: the path to this innovation wasn’t immediate. Earlier…

    Continue reading
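
    Here is a minimal sketch of the event-driven pattern: two steps chained by a custom event, with the names (DraftThenPolish, DraftEvent) invented for illustration. It assumes a recent llama-index-core release that ships the workflow module.

    ```python
    # Sketch: a minimal two-step Workflow where steps communicate via events.
    import asyncio

    from llama_index.core.workflow import Event, StartEvent, StopEvent, Workflow, step


    class DraftEvent(Event):
        """Carries an intermediate draft between steps."""
        draft: str


    class DraftThenPolish(Workflow):
        @step
        async def draft(self, ev: StartEvent) -> DraftEvent:
            # A real step would call an LLM here; this is faked to keep the sketch runnable.
            return DraftEvent(draft=f"Draft about: {ev.topic}")

        @step
        async def polish(self, ev: DraftEvent) -> StopEvent:
            return StopEvent(result=ev.draft.upper())


    async def main():
        workflow = DraftThenPolish(timeout=30)
        result = await workflow.run(topic="event-driven agents")
        print(result)


    if __name__ == "__main__":
        asyncio.run(main())
    ```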

  • LlamaCloud

    Access control over data is a big requirement for any enterprise building LLM applications, and LlamaCloud makes it easy to set up. LlamaCloud lets you natively index ACLs through our data connectors; for instance, the SharePoint connector loads user/org-level permissions directly as document metadata (a retrieval sketch with metadata filters follows this entry). It’s also easy to inject custom metadata through source…

    Continue reading
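
    As a rough sketch of what ACL-aware retrieval can look like once permissions are indexed as metadata: the index name, project, metadata key, and user email below are all hypothetical, and it assumes the llama-index-indices-managed-llama-cloud package with a LLAMA_CLOUD_API_KEY in the environment and a retriever that accepts standard metadata filters.

    ```python
    # Sketch: restricting LlamaCloud retrieval to documents the current user may see.
    from llama_index.core.vector_stores import (
        FilterOperator,
        MetadataFilter,
        MetadataFilters,
    )
    from llama_index.indices.managed.llama_cloud import LlamaCloudIndex

    index = LlamaCloudIndex("enterprise-docs", project_name="Default")  # hypothetical index

    acl_filters = MetadataFilters(
        filters=[
            MetadataFilter(
                key="allowed_users",          # hypothetical ACL field ingested as metadata
                value="alice@example.com",
                operator=FilterOperator.CONTAINS,
            )
        ]
    )

    retriever = index.as_retriever(filters=acl_filters, similarity_top_k=5)
    for result in retriever.retrieve("What is our parental leave policy?"):
        print(result.score, result.node.metadata)
    ```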

  • Work with Llama 3.1

    Ready to start working with Llama 3.1? Here are a few new resources from the Llama team to help you get started. First things first, get access to all of the latest Llama 3.1 models at: https://llama.meta.com/llama-downloads/ (a short Transformers sketch follows this entry). Want to access the code, new training recipes, and more? Check out the official Llama repo on GitHub:…

    Continue reading
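
    Once access is granted, a quick way to try the models from Python is Hugging Face Transformers. This is a rough sketch, not the Llama team’s official recipe: the instruct repo name, bfloat16 precision, and GPU assumptions are mine.

    ```python
    # Sketch: a first chat turn with Llama 3.1 8B Instruct via the Transformers pipeline.
    import torch
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="meta-llama/Meta-Llama-3.1-8B-Instruct",  # gated; accept the license first
        torch_dtype=torch.bfloat16,
        device_map="auto",
    )

    messages = [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "What is new in Llama 3.1 compared to Llama 3?"},
    ]
    output = generator(messages, max_new_tokens=128)
    print(output[0]["generated_text"][-1]["content"])  # last message is the assistant reply
    ```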