The English version of quarkus.io is the official project site. Translated sites are community supported on a best-effort basis.

Quarkus Newsletter #67 - April

Read "Build a Streaming AI Chat in Java with Quarkus, Vaadin, and LangChain4j", a hands-on guest post by Sebastian Kühnau showing how to stream LLM responses token by token in a pure Java UI with Vaadin Flow and Quarkus. Check out Mario Fusco’s post to learn our reasoning behind enabling by default in the Quarkus 3.35 release. Scivics Lab wrote "quarkus-chat-ui: A Web Front-End for LLMs, and a Real-World Case for POJO-actor", which introduces a new Chat UI component built with Vaadin and designed to integrate seamlessly into a Quarkus application, simplifying the creation of conversational interfaces. They follow up with "quarkus-chat-ui (2): The Actor Design Behind LLM-to-LLM Conversation", showing how to actually use POJO-actor in quarkus-chat-ui, a Quarkus-based LLM chat UI that connects to Claude Code CLI, vLLM, and other backends. Learn how to integrate Redis caching and data storage in a Quarkus Java application using the reactive API of the Quarkus Redis extension by reading "How to Use Redis with Quarkus in Java" by Nawaz Dhandala. Markus Eisele wrote "AI Coding Tools in 2026: How to Work With Agents Without Losing Control", a Java engineer’s operating model covering Ask/Plan/Code flows, guardrails, and review discipline at scale.

You will also find the latest Quarkus Insights episodes, top tweets and discussions, and upcoming events that Quarkus will attend.

Want to get the newsletter in your inbox? Sign up using the on-page form.