.NET MAUI Engineering Team Live Stream: AI-Powered .NET MAUI Development with MauiDevFlow
TL;DR
The .NET MAUI team is now shipping with AI deeply in the loop — Gerald showed their open-source PR dashboard where Copilot-authored PRs are approaching half of merged work, and said MAUI SR6 already shipped with “a lot of Copilot authored” and “Copilot reviewed” code.
MauiDevFlow is the big unlock for agentic MAUI development — instead of just looking at screenshots, the tool injects a debug-only HTTP agent into the app so Copilot can inspect the visual tree, map UI back to code, click elements, and verify fixes inside a running app.
The demo app started from a deliberately opinionated prompt, not magic — Jacob fed Copilot a prebuilt spec for a .NET 10 MAUI app called “Maui Verse,” including the mauiverse.net URL, latest .NET 10 docs, design constraints, no emojis, no Community Toolkit, and a requirement to generate three HTML mockups before writing UI code.
The team’s real workflow is “spec and steer,” not one-shot vibe coding — when Copilot grabbed the wrong DevFlow repo and got stuck in build confusion, Gerald and Shane manually redirected it to the official Maui Labs CLI and kept nudging it forward live.
AI is making MAUI development materially faster, even when it looks clumsy on screen — Gerald described building an iOS app, then having Copilot + DevFlow run the Android version, find edge cases, produce a to-do list, and implement fixes, while Shane said he shipped iOS from Windows using GitHub Actions guided by Copilot.
MAUI Labs is becoming the experimental AI toolbox for the ecosystem — beyond DevFlow, the team highlighted the new MAUI CLI, profiling tools, skills marketplace direction, and even a released GTK4 Linux backend now available on NuGet.
The Breakdown
A holiday livestream and a real-world AI side project
Gerald opens by joking that the Netherlands and Denmark are apparently the only places not off on May 1, then quickly pivots into why they’re here anyway: AI finally made his long-postponed conference app practical. He says the speaker/session app they started on an earlier stream has since been polished further and that the team is now in talks with a “big conference”; they also plug MAUI Day on May 29 in Kraków.
Inside the MAUI team’s own AI-assisted engine room
Before they build anything, Gerald pulls up the MAUI PR dashboard and makes the bigger point: this isn’t just a toy demo. Copilot-authored and Copilot-reviewed PRs are climbing fast, merged PR volume is up, and MAUI SR6 already shipped with a lot of AI-generated contributions — with the team consciously “building that confidence muscle” to trust the workflow.
From “vibe coding” to “agentic engineering”
There’s a funny but revealing detour on terminology: Gerald says “vibe coding” undersells what they’re doing, and Jacob offers “agentic engineering” as the more respectable label. The team frames the shift as normal now — not hiding AI usage, but proudly saying, essentially, “look at this thing, it works, I wrote zero code for it.”
Jacob’s prompt: strict, specific, and built to avoid AI slop
Jacob starts a Copilot session in YOLO mode and walks through the prompt he prepared beforehand. It tells Copilot to build a .NET MAUI mobile companion for mauiverse.net, points it to the latest .NET 10 MAUI docs because many models still don’t know .NET 10, tells it to use MAUI skills plus DevFlow, bans emojis, skips unit tests for the stream, and asks for three HTML mockups before writing the actual app so he can review UI direction like a lightweight Figma step.
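The exact wording of Jacob’s prompt wasn’t shown on screen, but based on the constraints he listed, a spec in that spirit might look roughly like this (every line here is a paraphrase, not the original text):

```text
Build a .NET MAUI mobile companion app for mauiverse.net, targeting .NET 10.

- Consult the latest .NET 10 MAUI documentation before writing any code;
  do not rely on training data for .NET 10 APIs.
- Use the MAUI skills plus MauiDevFlow to inspect and verify the running app.
- Do not use emojis anywhere in code or UI.
- Do not add the Community Toolkit.
- Skip unit tests for this session.
- Before writing any app UI, generate three HTML mockups of the main screens
  and wait for a design review before continuing.
```

The point of a spec like this is less about any single rule and more about closing off the failure modes the team later hit live: stale model knowledge, wrong tooling, and unreviewed UI direction.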
The first visible win: AI-generated mockups and live design choices
Copilot crawls mauiverse.net, discovers sections like “Mission Control” and “Cosmic Stream,” and generates three HTML mockups on the fly. The team reacts like real developers, not magicians: they roast the middle one for being too busy, debate left versus right, and settle on the first design as the safest and cleanest direction.
Why DevFlow matters more than screenshots
Gerald then explains MauiDevFlow, now in the Maui Labs repo, as the hero of the day. It injects a small debug-only HTTP server into the app so the agent can inspect the actual visual tree, take screenshots, click around, and connect what’s on-screen to the right place in code — a huge step up from blind screenshot-based automation.
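To make the idea concrete, here is a minimal sketch of what “a debug-only HTTP server that exposes the visual tree” could look like in a MAUI app. This is an illustrative assumption, not MauiDevFlow’s actual implementation or API — the port, endpoint, and payload shape are invented, and a real tool would also handle clicks, screenshots, and source mapping:

```csharp
#if DEBUG
// Sketch only: a debug-build-only HTTP listener that serializes the
// app's visual tree so an external agent can inspect the running UI.
// IVisualTreeElement.GetVisualChildren() is the real .NET MAUI API for
// walking the tree; everything else here (port, route, JSON shape) is
// a hypothetical stand-in for what a tool like MauiDevFlow provides.
using System.Linq;
using System.Net;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.Maui;

public static class DebugAgent
{
    public static void Start(IVisualTreeElement root)
    {
        var listener = new HttpListener();
        listener.Prefixes.Add("http://localhost:9222/"); // hypothetical port
        listener.Start();

        Task.Run(async () =>
        {
            while (listener.IsListening)
            {
                var ctx = await listener.GetContextAsync();
                // Walk the live visual tree and serialize it as JSON.
                var json = JsonSerializer.Serialize(Describe(root));
                var bytes = Encoding.UTF8.GetBytes(json);
                ctx.Response.ContentType = "application/json";
                await ctx.Response.OutputStream.WriteAsync(bytes);
                ctx.Response.Close();
            }
        });
    }

    // Recursively describe each element by type, plus its children.
    static object Describe(IVisualTreeElement element) => new
    {
        Type = element.GetType().Name,
        Children = element.GetVisualChildren().Select(Describe).ToArray(),
    };
}
#endif
```

An agent polling an endpoint like this gets structured data about what is actually on screen — element types, hierarchy, and (in a fuller version) ids and bounds — which is what lets it map UI back to code and verify a fix landed, instead of guessing from pixels in a screenshot.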
The messy truth: AI got confused, and the humans had to rescue it
The most useful part of the stream is probably the failure. Copilot pulls the wrong DevFlow implementation from an older repo, starts probing the wrong CLI, gets stuck on builds, and Shane and Gerald catch it live, steering it back to the official Maui Labs tooling and reminding viewers this is why explicit instructions, skills, and setup matter.
Bigger picture: Linux, skills, evals, and the “spec and steer” philosophy
While the agent churns, the team answers chat: yes, there’s now a GTK4 Linux backend in Maui Labs; yes, you can use skills, evals, and even multi-agent/fleet workflows; yes, performance constraints can be fed into the loop. Shane sums up the philosophy as “spec and steer,” saying the real gains come after investing time upfront to teach your repo, your tools, and your agents how to work together.
The app finally appears — tabs, real data, and a likely part two
Near the end, the payoff arrives: the app boots with tabs, a cosmic theme, and real profile data from people like David Ortinau. DevFlow scrolls the app on its own to verify behavior, everyone visibly relaxes, and the team decides this is enough of a baseline to continue next time rather than force a fake polished ending.