• Trigger GitHub Actions with enriched deployment data from Vercel

    You can now trigger GitHub Actions workflows in response to Vercel deployment events with enriched data using repository_dispatch events. These events are sent from Vercel to GitHub, enabling more flexible, cost-efficient CI workflows, and easier end-to-end testing for Vercel deployments.

    Previously, we recommended using deployment_status events, but these payloads were limited and required extra parsing or investigation to understand what changed.

    With repository_dispatch, Vercel sends custom JSON payloads with full deployment context—allowing you to reduce GitHub Actions overhead and streamline your CI pipelines.

    We recommend migrating to repository_dispatch for a better experience. deployment_status events will continue to work for backwards compatibility.
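    For example, a workflow can subscribe to these dispatch events and read the deployment context from the event payload. The sketch below is illustrative only — the event type name and the client_payload fields shown are assumptions, so check the payload Vercel sends to your repository for the exact shape:

    ```yaml
    # Hypothetical workflow reacting to a Vercel repository_dispatch event.
    # "vercel.deployment.success" and "client_payload.url" are assumed names,
    # not confirmed from Vercel's documentation.
    name: E2E on Vercel deployment
    on:
      repository_dispatch:
        types: [vercel.deployment.success]
    jobs:
      e2e:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4
          - name: Run end-to-end tests against the deployment
            run: npx playwright test
            env:
              # client_payload carries the enriched deployment data from Vercel
              BASE_URL: ${{ github.event.client_payload.url }}
    ```

    Because the full deployment context arrives in client_payload, the workflow can start testing immediately instead of calling back into an API to look up what changed.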


    Erika Rowland, Tom Knickman

  • Llama 4 is now available on Vercel Marketplace

    Meta’s latest and most powerful Llama 4 models are now available through the Vercel Marketplace via Groq.

    To get started for free, install the Groq integration in the Vercel dashboard or add Groq to your existing projects with the Vercel CLI:

    vercel install groq

    You can then use the AI SDK Groq provider with Llama 4:

    import { groq } from '@ai-sdk/groq';
    import { streamText } from 'ai';
    import fs from 'fs';

    const result = streamText({
      model: groq('meta-llama/llama-4-scout-17b-16e-instruct'),
      messages: [
        {
          role: 'user',
          content: [
            { type: 'text', text: 'Describe the image in detail.' },
            { type: 'image', image: fs.readFileSync('./data/llama.png') },
          ],
        },
      ],
    });

    for await (const textPart of result.textStream) {
      process.stdout.write(textPart);
    }

    For a full demo, check out the official Groq chatbot template (which now uses Llama 4) or compare Llama 4 against other models side-by-side on our AI SDK Playground. To learn more, visit our AI documentation.

  • Run and share custom queries in Observability Plus


    Observability Plus customers can now create and share custom queries directly from the Observability dashboard—making it easier to investigate specific metrics, routes, and application behavior without writing code.

    The new query interface lets you:

    • Filter by route to focus on specific pages and metrics

    • Use advanced filtering, with auto-complete—no query language needed

    • Analyze charts in the context of routes and projects

    • Share queries instantly via URL or Copy button

    This new querying experience builds on the Monitoring dashboard, helping you stay in context as you drill deeper into your data.

    To try it out, open your Observability dashboard and select the Explore query arrows on any chart, or open the query builder from the ellipsis menu.

    Learn more about running queries in Observability and its available metrics.


    Julia Shi, Damien Simonin Feugas, Timo Lins