
Phoenix Streams

Efficiently manage large or dynamic collections with Phoenix streams. Perfect for chat messages, feeds, and real-time lists.

What this example shows

1. Phoenix Streams: stream() and stream_insert()
2. Transparent Arrays: streams become arrays in Vue
3. Async Processing: start_async for background work
streams_live.ex
defmodule MyAppWeb.StreamsLive do
  use MyAppWeb, :live_view

  @ai_responses [
    "That's an interesting point! I'd love to hear more about your thoughts.",
    "I see what you mean. Have you considered looking at it from a different angle?",
    "Great question! Let me think about that for a moment...",
    "Thanks for sharing! This reminds me of something I read recently.",
    "I appreciate your curiosity. The answer might surprise you!"
  ]

  def render(assigns) do
    ~H"""
    <.vue
      messages={@streams.messages}
      isThinking={@is_thinking}
      v-component="Streams"
      v-socket={@socket}
    />
    """
  end

  def mount(_params, _session, socket) do
    welcome = %{
      id: Ecto.UUID.generate(),
      role: "assistant",
      content: "Hi! I'm a demo AI. Send me a message and I'll respond."
    }

    {:ok,
     socket
     |> assign(is_thinking: false)
     |> stream(:messages, [welcome])}
  end

  def handle_event("send_message", %{"content" => content}, socket) do
    content = String.trim(content)

    if content == "" do
      {:noreply, socket}
    else
      user_message = %{id: Ecto.UUID.generate(), role: "user", content: content}

      socket =
        socket
        |> stream_insert(:messages, user_message)
        |> assign(is_thinking: true)
        |> start_async(:ai_response, fn ->
          Process.sleep(Enum.random(800..1500))
          Enum.random(@ai_responses)
        end)

      {:noreply, socket}
    end
  end

  def handle_async(:ai_response, {:ok, response}, socket) do
    ai_message = %{id: Ecto.UUID.generate(), role: "assistant", content: response}

    {:noreply,
     socket
     |> stream_insert(:messages, ai_message)
     |> assign(is_thinking: false)}
  end

  def handle_async(:ai_response, {:exit, _reason}, socket) do
    {:noreply, assign(socket, is_thinking: false)}
  end
end

How it works

1 Streams are transparently converted to arrays

When you pass @streams.messages to a Vue component, LiveVue automatically converts it to a regular JavaScript array. All the memory and performance benefits of streams still apply on the server side.

<.vue messages={@streams.messages} v-component="Chat" v-socket={@socket} />
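Because the prop arrives in Vue as a plain array of objects, ordinary array methods apply; there is no stream-specific API on the client. A minimal sketch, assuming the message shape from the example above (formatMessages is a hypothetical helper, not part of LiveVue):

```javascript
// Hypothetical helper: the messages prop is just an array of
// {id, role, content} objects, so Array.prototype methods work.
function formatMessages(messages) {
  return messages.map((m) => `${m.role}: ${m.content}`);
}

const messages = [
  { id: "a1", role: "assistant", content: "Hi! I'm a demo AI." },
  { id: "b2", role: "user", content: "Hello!" },
];

// formatMessages(messages) returns
// ["assistant: Hi! I'm a demo AI.", "user: Hello!"]
```

In a component you would typically iterate the same array with v-for, keying each item by its id.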

2 Initialize and update streams in LiveView

Use stream/3 in mount to initialize, and stream_insert/3 to add items. Only the diff is sent over the wire, making updates very efficient.

# Initialize
|> stream(:messages, [])

# Insert
|> stream_insert(:messages, new_message)

3 Use start_async for background work

When processing takes time (like AI responses), use start_async/3 to run work in the background. This keeps the UI responsive and allows props to update during processing.

socket
|> assign(is_thinking: true)
|> start_async(:ai_response, fn ->
  # Background work here
  generate_response()
end)
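From the client's perspective, this flow produces two prop updates: one when is_thinking flips to true, and one when handle_async inserts the reply and flips it back. A hypothetical simulation of the sequence of props the Vue component receives (applyPatches is illustrative only, not a LiveVue API):

```javascript
// Illustrative only: merges successive prop patches the way the
// component experiences them around a start_async round trip.
function applyPatches(initial, patches) {
  const states = [initial];
  for (const patch of patches) {
    states.push({ ...states[states.length - 1], ...patch });
  }
  return states;
}

const states = applyPatches({ isThinking: false, messages: [] }, [
  // user sends a message; the server assigns is_thinking: true
  { isThinking: true, messages: [{ role: "user", content: "Hi" }] },
  // handle_async inserts the reply and clears the flag
  {
    isThinking: false,
    messages: [
      { role: "user", content: "Hi" },
      { role: "assistant", content: "Great question!" },
    ],
  },
]);

// states.map((s) => s.isThinking) returns [false, true, false]:
// the UI can render a typing indicator only in the middle state.
```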

4 Handle async results

The handle_async/3 callback receives the result when the background task completes. Because stream_insert/3 expects an item with an :id, wrap the raw result in a message map before inserting it, and update assigns to reflect the new state.

def handle_async(:ai_response, {:ok, response}, socket) do
  ai_message = %{id: Ecto.UUID.generate(), role: "assistant", content: response}

  {:noreply,
   socket
   |> stream_insert(:messages, ai_message)
   |> assign(is_thinking: false)}
end
Next up: Connection Status