Chris Sells

Hi, my name is Chris Sells.

I’m the founder and chief proprietor of sellsbrothers.com, the website HQ for Sells Brothers, Inc. I’m a consultant in applied AI for developer tools, frameworks and ecosystems, as well as a Flutter fanatic.

February 8, 2026

We Pour — Chapter 3

A Cyberpunk Story in the Fifth Element Universe

by Chris & Eli

The street hits me like a wall. Steam and noise and the churning river of foot traffic that parts slightly around me the way it always does, because people see 6’8 coming and step aside without thinking about it. I walk. I don’t flag a cab. I need the movement.

Mox.

His real name is Maxwell Huang and he was four foot nothing until he was sixteen and then he was five foot four and that was it, that was all he was ever going to get. On the commune, the two of us were a sight — the beanpole and the bean, mom called us. Stuck together because nobody else would have either of us.

The other kids threw rocks at the cows and called it fun. Mox built a radio out of combine parts and pulled in signals from orbit. I sat next to him in the barn loft and listened to cargo ship chatter from the outer stations and he’d say “Dunk, there’s a whole world up there that doesn’t give a shit how tall either of us are” and I believed him because Mox never lied to me.

I stop walking. I’m in the middle of the sky bridge between Sectors 7 and 9, a hundred floors up, wind pulling at my jacket. Below me the city falls away into blue haze and blinking lights. A police cruiser drifts past at eye level, close enough to see the officer’s coffee cup on the dash.

I know what Mox would say. He’d say it was just a job. He’d say he didn’t know it was Ellie. He’d say that David came to him cold, through a cutout, and he never asked who the target was because you don’t ask, that’s the rule, that’s how it works in this business.

And maybe that’s even true.

But he took the job. And he’s good enough that he would have seen the name in the access logs. Ellie Covacs. Same last name as the best friend he’s had since he was nine years old. Same girl he was never quite able to outgrow.

And he did the job anyway.

My phone buzzes. It’s Mox.

[mox] hey. noodles later? im buying. got something to tell you

I stare at the screen. The wind picks up. A cab honks somewhere below.

yeah man lets do noodles later

Later is as soon as I can manage it. I need time to think. I take another turn and then another and then I’m back at my place, sitting on a dusty old couch of suspicious origins and looking out into the world as the traffic goes by. As soon as I sit, I automatically pull up my phone.

[mox] sure thing big man. the job today ends 5ish. usual spot also ur cats here again

I type back.

keep him

The couch smells like the couch has always smelled, which is to say like whoever owned it before me, a smell no amount of enzyme cleaner will ever fully kill. The ceiling is close enough that I could touch it if I reached for it. I don’t reach. I just lie there and watch the traffic paint light across the plaster. Red. White. Blue. Red. White.

The cat’s with Mox. Mox is three floors down and two units over and we’ve been in each other’s lives for twenty-three years and he might have…

Wait.

I pull up the phone. Thumb past Mox, past Ellie’s ALL-CAPS flare, past the junk, and there it is again. The job ping. Routed through three anonymizers, which means whoever sent it knows enough to not want to be found but also knows enough to find me, which narrows the field more than they’d probably like.

No name. No company. No brief. Just:

[unknown]
RETRIEVAL -- DIGITAL ASSET
TIMELINE: COB FRIDAY
COMPENSATION: 15,000 CR
ACCEPT / DECLINE / QUERY

Fifteen thousand credits. That’s five months of rent. That’s Ellie’s lawyer if things go bad. That’s breathing room I haven’t had since the last corp job dried up in November.

Close of business on Friday. Same as Ellie’s deadline. Stated the same way.

My thumb hovers over QUERY. The part of my brain that keeps me alive — the pattern-matching, paranoid, hacker part — is already whispering. Anonymous job, big money, fast timeline, lands in my inbox the same morning my sister tells me she’s being squeezed by a guy who hired my best friend to build the frame.

Some people believe in coincidences. I believe in patterns.

The traffic light cycles. Red. White. Blue.

Eventually I press the QUERY button.

The screen refreshes. Whoever’s on the other end was waiting.

[unknown]
ASSET: Personnel file. Encrypted.
LOCATION: Zorg Industries internal network
DIVISION: Applied Human Resources

TARGET FILE: ZI-AHR-00891

Retrieve and deliver. File must be intact
and uncopied. Original extraction only.
You keep nothing.

Dead drop coordinates on acceptance.

ACCEPT / DECLINE / QUERY

Zorg Industries.

I sit up on the couch. The springs groan.

Zorg is not a noodle shop. Zorg is the third largest defense contractor on Earth, supplier of weapons systems to half the governments in the Federated Territories and most of the ones that aren’t. Their network security isn’t a wall — it’s an ecosystem. Adaptive ICE, behavioral analytics, honeypot architectures so deep that slicers have gone in for a soda pop and come out hours later to find federal agents sitting in their apartment.

I know this because two years ago I did a white hat audit for a Zorg subsidiary. They paid for it. Third-tier, nothing classified, but I saw enough of the outer architecture to know the inner layers would be a different animal entirely. I still have my notes from that job somewhere on my terminal.

“Applied Human Resources.” I’ve never heard of that division. HR is HR. You don’t put “Applied” in front unless it’s doing something that isn’t human resources.

And the stipulation — original extraction, no copies, keep nothing. That’s not a corporate espionage play. If someone wanted to steal Zorg secrets they’d want to leave behind the original to cover their tracks. This is someone who wants a file to disappear. Or someone who wants to know what’s in their own file and can’t get to it through normal channels.

Fifteen thousand credits. COB tomorrow. And a target that would take most slicers a month to plan.

But most slicers didn’t spend three months inside Zorg’s outer ring with a badge and a smile.

My thumb hovers.

The light through the slats shifted from gray to gold to amber to the deep bruised purple that passes for sunset when you’re ninety-seven floors up and the smog catches it just right. None of it registers.

The job ping is still unanswered. Three tabs running — Zorg subsidiary architecture from my old audit notes, a Sector 12 corporate registry search that’s half-loaded, and Ellie’s doctored records that I’ve stared at for so long the numbers have stopped meaning anything. I drop my phone and rub both fists into my eyes, trying to push back the headache forming at the base of my skull.

My brain is hot swap thrashing. Ellie. Mox. Zorg. Fifteen thousand. Friday. Twenty-four hours. Mox’s hands built those forgeries.

Then, Mox’s notification buzzing on the cushion next to me.

[mox] dunk?? its 520 u coming or what? i ordered for u. the good stuff not cheapo

5:20. Shit.

I’m off the couch and out the door and I haven’t decided anything. The elevator drops and my stomach drops with it and the whole way down I’m rehashing conversations that haven’t happened yet. I’ll look at him across the table. He’ll be Mox — same grin, same fast hands pulling apart chopsticks, same guy who built a radio in a barn. And I’ll know what he did and he won’t know I know and I’ll have to eat noodles and act like everything is…

He said he had something to tell me.

February 6, 2026

We Pour — Chapter 2

A Cyberpunk Story in the Fifth Element Universe

by Chris & Eli

The express elevator drops 97 floors in eleven seconds. My ears pop at 40. The lobby is its usual zoo — a Mangalore in a maintenance uniform arguing with the building super, two kids chasing each other through the turnstiles, and a police drone hovering by the mailboxes, scanning packages with a blue fan of light.

Nobody looks twice at me. Everybody looks once.

Outside, I flag a cab. The driver — a wiry guy with a headset bolted to his skull and a dozen dashboard saints glued to the console — tilts his mirror to take all of me in.

“Where to, long man?”

“Sector 7. Yung Fat Noodle, street level.”

“Ah, the noodle girl! She your lady?”

“She’s my sister.”

“Even better. Family! I give you fast route.”

He yanks the stick and the cab lurches into the fourth lane, cutting off a freight hauler that screams past close enough to rattle my fillings. Through the windshield, New New York stacks itself up and out forever — towers threaded with sky bridges, holographic billboards the size of city blocks, sunlight hitting the upper levels while the streets below stay in permanent blue dusk.

The cab drops hard through three lanes and pulls up at street level. Down here the light is neon and steam. Ellie’s shop — YUNG FAT NOODLE, hand-painted sign, been here since before I was born — sits between a vape den and a Zorg-brand electronics outlet with a cracked window display.

The lunch rush hasn’t started. Through the glass I can see Ellie wiping down the counter, alone. She moves the way she always moves — efficient, contained, like everything is a task and tasks get done. But there’s something in her shoulders today. A tightness.

She sees me through the window and stops wiping.

The pay pad in the cab reads my thumb and the number blinks at me. Not enough to flinch at. Enough to notice. The anonymous ping flickers in the back of my skull like a pilot light.

“You have blessed day, long man!” The cab lurches skyward and is gone.

I stand on the sidewalk for a second. Steam curls up from a grate and wraps around my knees. Down here at street level the air tastes like grease and ozone, and for just a moment — the neon, the wet pavement — it hits me sideways.

I see rain on the long dirt road up to the house. The combine coughing black smoke while Dad swore at it and Mom handed him the wrong wrench on purpose just to hear him say “Margaret, that is NOT a nine-sixteenths.” Chickens in the yard. Ellie at the dinner table doing homework with her tongue poking out the side of her mouth, same face she makes now when she’s reconciling the register.

The commune kids thought I was a freak. Thirteen years old and already looking down at every adult on the property. But Ellie — five foot two, all elbows and opinions — would walk next to me like a bodyguard escorting a building. Daring anyone to say something. Nobody ever did.

I push open the door. The little bell rings.

Ellie looks up. She’s got a rag in one hand and a spray bottle in the other and she’s wearing the same Yung Fat apron she’s worn for six years and I think the thing I always think and never say to her: you should own this place. You should own five of these places. You’re smarter than everyone you work for.

But she took the waitress gig when I needed someone to co-sign for the apartment. And she never left. Because leaving would mean not being twenty minutes away when I go dark for a week and stop answering my phone.

“Sit down,” she says. Not mean. Just — the way you say it when you’ve been practicing what comes next.

She pours me a coffee without asking. Sets it down. Sits across from me in the booth by the window, where the neon from the vape place next door turns everything a faint pink.

She’s not wiping anything. She’s not moving. Ellie always moves.

“Dunk, I’m in trouble.”

My hands go flat on the table. I don’t realize I’ve done it until I feel the cold laminate under my palms. Something in my chest pulls tight like a cable taking load.

“What kind of trouble.”

It comes out flat. Not a question. She hears the difference. She’s heard it before — the voice I get when the part of me that sits in front of terminals and cracks jokes with Mox steps aside and the other part comes out. The part that once put a Mangalore debt collector through a drywall partition just for showing up at her apartment.

Ellie wraps both hands around her own mug. Looks into it.

“You know Mr. Yung’s son. David.”

“The one with the teeth.”

“The one with the teeth. He’s been running the books for the last two years. His dad’s too old, and David…” She trails off. Starts again. “David’s been skimming. Not a little. A lot. But that’s not — that’s his problem, not mine. Except three weeks ago he came to me and said he needed to move some money through the shop’s account. Digital transfers, routed offshore. He said it was to cover what he took before the auditors caught it.”

She looks up at me.

“I said no. Obviously. And then last Tuesday he told me if I didn’t help him, he’d tell the Feds I was the one skimming. He’s got records, Dunk. Altered ones. He was so proud that he left me a copy. Time-stamped with my access codes. I don’t know how he…”

Her voice catches. Just for a second. She sets her jaw and pushes through it the way she’s pushed through everything since the commune foreclosed and Mom got sick and Dad just stopped.

“He says I have until Friday before close of business.”

Today is Thursday.

The coffee sits untouched in front of me. The neon pinks and pulses. Somewhere in the kitchen a timer goes off and nobody’s back there to answer it.

I lean back in the booth and the vinyl creaks under me as I think.

David Yung. I run what I’ve got.

The teeth — capped, porcelain, expensive. Way too expensive for a guy running the books at his dad’s noodle shop. I noticed that two years ago and filed it in the back drawer where I keep things that aren’t my business yet.

I’ve seen him maybe a dozen times. Comes in late, always on his phone, never eats here. Drives a Zorg Stratos — the sport model, not the fleet one. Again: too much money. I clocked the plates once out of habit. Registered to a leasing company in Sector 12. Corporate shell. I remember thinking that was a weird way to lease a car unless you didn’t want your name on things.

And there was that one night. Six, maybe eight months back. I was picking up takeout and David was in the back booth with two guys I didn’t recognize. Suits, but not corp suits — the kind that fit too well. Private sector. Security or money, one of the two. David saw me looking and smiled with all those perfect teeth and I thought: that’s the smile of a man who wants you to know he’s got friends in high places.

Mox. Mox would know more. Mox knows things about everyone in Sector 7 because Mox never leaves Sector 7 and never stops listening. And Mox owes me for the firewall job last month.

So I’ve got threads. The car. The shell company. The suits. And a guy who somehow got Ellie’s access codes, which means he’s either got a slicer on payroll or he’s poking around in systems he shouldn’t be able to reach — and either way, that’s my turf, not his.

Bessie would be faster. Bessie lives under my bunk and has never failed to make a point. But Ellie’s looking at me right now and what she needs isn’t a big man with a bat. She needs the other thing I do.

“Ellie. I need you to tell me everything about how you log into the shop’s systems. Every code, every terminal, every time you’ve ever let David touch your workstation. And I need the name of whoever does the shop’s network.”

She blinks. Then — just barely — the shoulders drop half an inch.

“You’re going to fix this.”

“I’m going to fix this.”

Ellie slides her phone across the table and I scroll through the doctored records. Whoever built these didn’t just change numbers. They rebuilt the access logs from the inside out — spoofed timestamps, cloned Ellie’s credential hash, even salted the metadata so a surface audit would read clean. This isn’t some script kiddie with a purchased toolkit. This is craft.

And that’s what’s bugging me.

Every slicer has a style. The way a locksmith can look at a picked lock and tell you what brand of tension wrench was used. I’m staring at the way the timestamp spoofs are layered — three-pass overwrites, each one seeded off the previous hash, cascading so the whole chain validates if you pull any single link. It’s elegant. It’s paranoid. And it’s backwards.

Most slicers build forward. This person builds from the endpoint and works back to the origin. I’ve seen exactly one person do it that way.

My stomach drops about six inches.

No.

I scroll back. Look at the credential clone. The way it mirrors Ellie’s key signature — it doesn’t just copy the hash, it reconstructs the generation pattern so it would pass a deep audit. There’s a little flourish in the reconstruction. A redundant verification loop that serves no functional purpose. It’s a signature the way a painter leaves a brushstroke. It’s there because he can’t help himself.

Mox.

My best friend. The guy who never leaves Sector 7. The guy who owes me for the firewall job. The guy who knows things about everyone because he never stops listening.

The guy who apparently knows David Yung well enough to do a five-figure forgery job for him.

Ellie’s watching my face.

“Dunk? What is it?”

The neon pulses pink. The kitchen timer is still going off. I get up to go and am halfway to the door before turning back. Ellie looks at me with worried eyes.

I walk back to put my hand on Ellie’s shoulder. She reaches up and squeezes my fingers once, hard, the way she used to when I was small and the thunder got bad. Somehow her touch means that I’m the one who feels steadied.

“It’s going to be fine. I know exactly what to do.”

She nods. She believes me. That’s the worst part.

February 4, 2026

We Pour — Chapter 1

A Cyberpunk Story in the Fifth Element Universe

by Chris & Eli

The alarm doesn’t wake me. The cat does.

Not my cat. I don’t have a cat. But the orange tabby that lives two vents over has figured out my ductwork again, and it’s sitting on my chest like it owns the lease. Which, given what I pay for a 400-square-foot box on the 97th floor of Tower Block Chinatown, it might as well.

I peel myself out of the fold-down bunk — a process that involves actual origami when you’re 6’8 — and stand in the blue-gray light filtering through the slatted window. Outside, the morning traffic is already stacking up. Hover cabs and freight drones jostle for lane position, their running lights smearing across the glass like wet paint.

The terminal on my desk is blinking. Two messages.

One’s from my sister. The subject line just says “CALL ME.” All caps. No punctuation. That’s her version of a flare gun.

The other’s a job ping, routed through three anonymizers. No name. No brief. Just a figure.

It’s a very good figure.

The cat meows. The coffee maker gurgles to life on its timer, and for about four seconds the apartment smells like something other than duct metal and old takeout.

I thumb the call back to my sister. She picks up before the first ring finishes.

“Shit, Ellie…”

“Don’t ‘Shit, Ellie’ me, Duncan. Where have you been? I called you four times yesterday. Four. And don’t say you were working because I pinged Mox and he said you’ve been quote ‘doing absolutely nothing of value for like a week’ unquote, which sounds right because…”

“Ellie.”

“…because you always go dark when you’re between jobs and I know what that means, it means you’re eating noodle packets and staring at your ceiling and telling yourself you’re ‘thinking’ when really you’re just…”

She stops. Takes a breath. When she comes back her voice is different. Quieter. The real voice under the big-sister artillery barrage.

“I need a favor. A real one. Not a tech one.”

The cat jumps onto my desk and sits directly on the terminal, covering the anonymous job ping with its ass.

“Can you come to the shop today? Before lunch. It’s… I don’t want to do this over the phone, okay? Just come.”

Something in the way she says it. Ellie doesn’t ask. Ellie tells. Ellie has been telling me what to do since I was four years old and she was six and a half. The fact that she’s asking sits wrong in my chest, like a rib out of place.

“Dunk? You there?”

“Shit, Ellie… Yeah. I’ll be right there.”

She hangs up without saying goodbye. That’s normal. Ellie treats phone calls like airlock cycles — get in, get out, don’t waste the oxygen.

I pull on yesterday’s jacket. It fits the way all my jackets fit — about two inches short in the sleeves. I grab the coffee on the way out and duck through the doorframe, a motion so automatic my body does it without consulting my brain. The cat watches me go with the deep indifference of someone that is allowing you to come and go as you please. For now.

January 6, 2026 flutter

Guest Post: Enabling Image Paste in Flutter

How a simple feature request led to a deep dive into AppKit, responders, and the creation of the mac_menu_bar plugin.

By Emmanuel David Tuksa (@DeTuksa)

Foreword

Before we dive into the technical details, I want to extend a massive thank you to Chris Sells (@csells). This journey started with a single feature request on the flutter_ai_toolkit repo, and Chris’s guidance, curiosity, and “what if” questions were the catalyst for the solutions I am sharing today. I am incredibly grateful for his support throughout the development of the mac_menu_bar plugin and for the opportunity to share this story here on his blog.

Introduction

When I saw the feature request in the flutter_ai_toolkit repository, “Add the ability to paste an image from the clipboard into the text box,” I thought it would be a straightforward weekend project. Little did I know this task would lead me down a rabbit hole of platform-specific limitations, eventually resulting in the creation of a new Flutter plugin: mac_menu_bar.

[!NOTE]

As of the publication of this blog post, the Feature/paste image and text from clipboard PR has not yet been accepted by the Flutter team for inclusion in the Flutter AI Toolkit. However, Emmanuel has been kind enough to apply this same feature to the dartantic_chat package (with the ability to drag n’ drop images into chat coming soon), a fork of the chat widget simplified for use with dartantic agents and providers. Thanks to Emmanuel for his excellent work and this blog post!

–Chris

To make pasting feel “native”, it has to work via three distinct paths:

  • Keyboard shortcuts (Cmd+V)
  • Context menus (right-click -> Paste)
  • The system menu bar (Edit -> Paste)

If even one of these paths behaves differently, the user experience breaks. The Flutter Clipboard API is intentionally minimal; it works great for text, but images and richer data types quickly fall outside its scope. My first step was seeking a package that could handle binary data.

After discussing it with Chris, we initially looked at pasteboard. It was a good start, but it hit a wall quickly:

  • Web Limitations: It couldn’t handle local images on the web.
  • The “Hijack” Problem: It couldn’t intercept the system’s native “Paste” commands universally.

Since the pasteboard package couldn’t handle images on the web properly, I fell back to the Flutter web package, using dart:js_interop to convert JS types to Dart, but the maintenance burden was high, and its fragility and inconsistency made it clear that it wasn’t a long-term fix.

Following a suggestion from the issue conversation, I switched to super_clipboard. This was a game-changer as it provided the cross-platform support I needed for binary data, and I felt I was 90% of the way there. But then I noticed a glaring issue on macOS.

While Cmd+V and right-click paste worked perfectly, clicking Edit -> Paste from the macOS system menu bar did…nothing. Even worse, digging deeper revealed that this wasn’t just a paste problem. The entire native Edit menu (Copy, Cut, Paste, Select All) was completely disconnected from Flutter’s app logic.

Flutter provided a PlatformMenuBar, so my first instinct was to override the Paste menu item directly:

PlatformMenuBar(
  menus: [
    PlatformMenu(
      label: 'Edit',
      menus: [
        PlatformMenuItemGroup(
          members: [
            PlatformMenuItem(
              label: 'Paste',
              shortcut: const SingleActivator(
                LogicalKeyboardKey.keyV,
                meta: true,
              ),
              onSelected: _handlePaste,
            ),
          ],
        ),
      ],
    ),
  ],
  child: ...
)

At first glance, this looks reasonable, but it exposes a critical limitation:

Creating a custom PlatformMenuBar replaces the native macOS menu entirely. That means:

  • Default OS menu items are erased.
  • Platform-provided behaviour is lost.
  • The menu structure must be manually reconstructed.
  • OS changes between macOS versions aren’t preserved.

This wasn’t a new issue as similar reports already existed, but it highlighted a fundamental gap. Flutter does not expose a way to:

  • Inspect the existing native macOS menu bar
  • Override only specific actions (like Paste)
  • Preserve OS-provided defaults that change between macOS versions

On macOS, menu items aren’t just UI; they’re tightly integrated with the responder chain. Simply handling paste” in Dart isn’t enough if the menu item itself never reaches Flutter.

As Chris put it: “What we need is a macOS plugin that can iterate over what a normal macOS app gets in the menu bar and allows you to override the functionality as you choose.”

That plugin didn’t exist, so I built it.

Implementation: The Universal Clipboard

Unlike the standard clipboard, super_clipboard provides a ClipboardReader that can “interrogate” the clipboard. Instead of guessing what’s there, we can explicitly ask: “Can you provide a PNG? A PDF? A File URI?”

I implemented a _pasteOperation that prioritises data based on its richness. The logic follows a specific hierarchy:

  1. Documents & Files: Check for PDFs, DOCX, XLSX.
  2. File URIs: If someone copied a file directly from Finder or File Explorer.
  3. Images: Iterating through PNG, JPEG, WebP, etc.
  4. Plain/HTML Text: The fallback for standard communication.

This mirrors how users expect paste to work: if a file or image is present, paste the file or image; otherwise, paste text.

final reader = await clipboard.read();
if (reader.canProvide(Formats.fileUri)) {
  // Handle pasted files
}
for (final format in fileFormats) {
  if (reader.canProvide(format)) {
    // Handle binary files
  }
}
for (final format in imageFormats) {
  if (reader.canProvide(format)) {
    // Handle images
  }
}
if (reader.canProvide(Formats.plainText)) {
  // Fallback to text
}

Nothing is assumed. Every branch is explicit.

On the web, clipboard access works very differently. Instead of actively reading from the clipboard, browsers deliver paste data via DOM events. super_clipboard abstracts this by exposing ClipboardEvents, allowing you to register a paste listener once and reuse the same parsing logic.

To keep the API consistent across platforms, I used conditional exports:

export 'paste_helper_stub.dart'
  if (dart.library.js_interop) 'paste_helper_web.dart';

On the web, this registers an event listener. On the desktop, it resolves to a no-op stub. This lets the rest of the app call handlePasteWeb() unconditionally, without platform checks scattered throughout the codebase.
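For context, here’s a rough sketch of what the web half of that helper might look like. This is not the plugin’s actual source; it assumes super_clipboard’s ClipboardEvents API (registerPasteEventListener and getClipboardReader) and elides the actual handling:

// paste_helper_web.dart (a sketch, not the real implementation)
import 'package:super_clipboard/super_clipboard.dart';

void handlePasteWeb() {
  // ClipboardEvents is only available in a browser; on desktop the stub is used.
  final events = ClipboardEvents.instance;
  if (events == null) return;

  events.registerPasteEventListener((event) async {
    final reader = await event.getClipboardReader();
    // Feed the reader into the same prioritized hierarchy shown earlier:
    // files first, then images, then plain text.
    if (reader.canProvide(Formats.png)) {
      // handle the pasted image
    } else if (reader.canProvide(Formats.plainText)) {
      // fall back to text
    }
  });
}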

At this point, pasting logic was solved everywhere except one place: Edit -> Paste in the macOS menu bar.

The Final Missing Piece: Building mac_menu_bar

The problem wasn’t Flutter, and it wasn’t the clipboard… it was AppKit. On macOS, menu items don’t emit keyboard events or text input signals; they invoke selectors directly through the AppKit responder chain. My goal was to keep the native menu but “borrow” its actions. To achieve this, I wrote a native macOS plugin in Swift that performs a “surgical intervention” on the app’s main menu. Instead of replacing the menu, the plugin:

  1. Locates existing menu items (like “Paste”) using their system selectors.
  2. Saves a reference to the original target and action.
  3. Injects itself as the new target.

In the native code, I used the findMenuItem helper to crawl the NSApplication.shared.mainMenu. Once the “Paste” item is found, we swap its destination:

private func overrideMenuItem(selector: Selector, handler: Selector) {
    guard let item = findMenuItem(for: selector) else { return }
   
    // 1. Save the original so we don't break the OS
    originalActions[selector] = OriginalAction(target: item.target as AnyObject?, selector: selector)
   
    // 2. Hijack the action
    item.target = self
    item.action = handler
}

One of the most important features of this plugin is the Boolean Handshake. When a user clicks “Paste” in the menu:

  1. The Swift plugin catches the event.
  2. It sends a message to Flutter: “Hey, do you want to handle this paste?”
  3. If Flutter returns true (e.g., because we have an image in the clipboard), the plugin stops there.
  4. If Flutter returns false or null, the plugin calls forwardDefaultAction, which sends the event back to the original macOS handler. This ensures that if our custom logic doesn’t apply, the standard text pasting still works perfectly.

MacMenuBar.onPaste(() async {
  final handled = await myCustomPasteLogic();
  return handled; // True triggers our code, False triggers native OS code
});

On the Dart side, I implemented a clean PlatformInterface to make the plugin easy to use. Developers don’t need to know about Swift selectors or NSMenu; they simply register an asynchronous callback:

/// Registers a callback to be invoked when the Paste menu item is selected.
///
/// The [handler] should return a [Future] that completes with `true` if the
/// operation was handled, or `false` to allow the default system behavior.
///
/// Example:
/// ```dart
/// MacMenuBar.onPaste(() async {
///   // Handle paste operation
///   return true; // Return true to indicate the action was handled
/// });
/// ```
static void onPaste(Future<bool> Function() handler) =>
    MacMenuBarPlatform.instance.setOnPasteFromMenu(handler);

If you’d like to handle macOS menu bar items for your own purposes, you can do so using the mac_menu_bar package.

Summary

What began as a single GitHub issue in the flutter_ai_toolkit repo resulted in a robust, reusable solution for the entire Flutter community. By digging into the native layer, we solved three major problems:

  • Universal Image Pasting: Supporting images across Web, Mobile, and Desktop (macOS).
  • Native Menu Interception: Bridging the gap between the macOS Menu Bar and Flutter.
  • Platform Harmony: Creating a fallback system that respects the OS while extending its capabilities.

Sometimes, the hardest bugs aren’t where you expect them to be. And sometimes, fixing “paste” means understanding how an operating system really works.

December 10, 2025 flutter ai

Dartantic 2.0: The Nano Banana Edition

tl;dr: If you’re new to dartantic, it’s a multi-provider agentic toolkit for Dart and Flutter developers that runs wherever Dart runs, i.e. Flutter web, desktop and mobile, CLI and server-side. Today there’s a new release, but you can skip all of that and head to the docs to get all you need to get started: https://docs.dartantic.ai.

Welcome to Dartantic 2.0!

Are you a Dart or Flutter developer deep into AI looking for a multi-provider agentic framework that runs wherever Dart runs? Or perhaps you’re simply AI curious?

In either case, have I got a deal for you: today is the day that dartantic_ai 2.0 ships!

This is a big one. 12K+ lines of new and updated code. Unified thinking mode. Server-side tooling across all of the Big 3 providers: Google, Anthropic and OpenAI. New media generation models to create images and files of all kinds. Plus a ton of quality-of-life improvements, as well as some breaking changes (making omelets and all that).

And, of course, Nano Banana and Gemini 3 Pro Preview support.

What more could any young Dart or Flutter developer ask for? And I’ve got it all here for you right now.

Getting Started

If you’re new to Dartantic, here’s the 30-second version:

import 'package:dartantic_ai/dartantic_ai.dart';

void main() async {
  // Create an agent - use the default model or specify one
  final agent = Agent('google:gemini-3-pro-preview');

  // Send a message
  final result = await agent.send('Hello! What can you help me with?');
  print(result.output);
}

That’s it. Set your API key in the environment (OPENAI_API_KEY, GOOGLE_API_KEY, etc.) and you’re off. Want to switch providers? Change google to anthropic or openai-responses. Want a different model? Just change the model part of the string. For the full details, you’ve got the dartantic docs.
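To make that concrete, here’s a minimal sketch using only provider names that appear in this post; it’s the same string-based lookup shown above, just pointed at different backends:

// Same code, different backends: only the model string changes.
final gemini = Agent('google:gemini-3-pro-preview');
final claude = Agent('anthropic');        // provider default model
final gpt = Agent('openai-responses');    // OpenAI via the Responses API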

Unified Thinking API

Extended thinking (chain-of-thought reasoning) is now a first-class feature in Dartantic with a simplified, unified API across all providers.

Here’s what it looks like:

// Enable thinking
final agent = Agent('google:gemini-3-pro-preview', enableThinking: true);

// Access thinking as a first-class field
final result = await agent.send('Complex question...');
if (result.thinking != null) print(result.thinking);

This new model works the same for whatever provider you’re using (assuming they support thinking). The provider-specific fine-tuning options remain for advanced use cases:

  • GoogleChatModelOptions.thinkingBudgetTokens
  • AnthropicChatOptions.thinkingBudgetTokens
  • OpenAIResponsesChatModelOptions.reasoningSummary

But for most of us? Just flip the boolean and go.
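If you do want the provider-specific knobs, here’s a minimal sketch of pairing the unified flag with Google’s thinking budget. The option names come from the list above; exactly how they compose with enableThinking is my assumption:

// A sketch (assumption): the unified flag plus a provider-specific budget
// passed through chatModelOptions, as with the other options in this post.
final agent = Agent(
  'google:gemini-3-pro-preview',
  enableThinking: true,
  chatModelOptions: const GoogleChatModelOptions(thinkingBudgetTokens: 2048),
);

final result = await agent.send('Complex question...');
if (result.thinking != null) print(result.thinking);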

Server-Side Tools Across Providers

Server-side tools are now supported across multiple providers. These are tools that run on the provider’s infrastructure, not yours.

  • OpenAI Responses: Web Search, File Search, Image Generation, Code Interpreter
  • Google: Google Search (Grounding), Code Execution
  • Anthropic: Web Search, Web Fetch, Code Interpreter

Here’s how you use them:

// Google server-side google search
final agent = Agent(
  'google',
  chatModelOptions: const GoogleChatModelOptions(
    serverSideTools: {GoogleServerSideTool.googleSearch},
  ),
);

// Anthropic server-side web search
final agent = Agent(
  'anthropic',
  chatModelOptions: const AnthropicChatOptions(
    serverSideTools: {AnthropicServerSideTool.webSearch},
  ),
);

// OpenAI server-side web search
final agent = Agent(
  'openai-responses',
  chatModelOptions: const OpenAIResponsesChatModelOptions(
    serverSideTools: {OpenAIServerSideTool.webSearch},
  ),
);

The pattern is consistent across providers even though the underlying implementations are completely different. That’s the whole point of dartantic. I hate to say “write once, run on any provider” but…

I’m also keeping my eye on Google’s file search tool which would bring Google to feature parity with OpenAI’s vector search capabilities. As soon as that lands in the Dart SDK, dartantic will support it.

Media Generation with Nano Banana Pro

If you’re into LLMs at all, you’ve probably seen talk about Nano Banana and Nano Banana Pro. Dartantic’s new media generation support handles both of these Gemini models:

// Google provider uses Nano Banana by default (gemini-2.5-flash-image)
// for image generation
final agent = Agent('google');

final imageResult = await agent.generateMedia(
  'Create a b&w drawing of a robot mascot for a developer conference.',
  mimeTypes: const ['image/png'],
);

// Configuring the Google provider with Nano Banana Pro (gemini-3-pro-image-preview)
final agent = Agent('google?media=gemini-3-pro-image-preview');

final imageResult = await agent.generateMedia(
  'Create a 3D robot mascot for a developer conference.',
  mimeTypes: const ['image/png'],
);

The image at the top of this blog post was generated by Nano Banana Pro during one of the test runs.

But here’s where it gets interesting. Dartantic’s media generation isn’t limited to images:

final agent = Agent('google');

// PDF generation - uses Gemini 3 Pro Preview + code execution
final pdfResult = await agent.generateMedia(
  'Create a one-page PDF with the title "Project Status" and '
  'three bullet points summarizing a software project.',
  mimeTypes: const ['application/pdf'],
);

// CSV generation - uses Gemini 3 Pro Preview + code execution
final csvResult = await agent.generateMedia(
  'Create a CSV file with columns: date, users, revenue. '
  'Add 5 rows of sample data.',
  mimeTypes: const ['text/csv'],
);

The media generation providers (Google, Anthropic and OpenAI via the Responses API) are implemented to route to their image generation model if they have one and to their code execution environment if they don’t. For you, it’s just: pick your provider, send in the prompt + MIME type and you’re good to go.

Filling the Gaps

As I build out dartantic, I get to find out each provider’s “special” behavior.

Structured Output + Tools: For example, all of the Big 3 support tool calling and structured output. However, only OpenAI (via either the Completions or Responses API) supports tool calling AND structured output in the same request. Neither Google nor Anthropic does. So, inspired by the community (thanks @fatherOfLegends!), I’ve worked around that problem for both the Google and Anthropic providers so you can just do this and good things happen:

class TimeAndTemperature {
  const TimeAndTemperature({required this.time, required this.temperature});
  factory TimeAndTemperature.fromJson(Map<String, dynamic> json) => ...
  static final schema = ...

  final DateTime time;
  final double temperature;
}

final agent = Agent(
  'google', // or openai or anthropic or ...
  tools: [temperatureTool],
);

final result = await agent.sendFor<TimeAndTemperature>(
  'What is the time and temperature in Portland, OR?',
  outputSchema: TimeAndTemperature.schema,
  outputFromJson: TimeAndTemperature.fromJson,
);

print('time: ${result.output.time}');
print('temperature: ${result.output.temperature}');

I keep an eye out for provider improvements so as the LLMs get better, dartantic gets better, too.

Google Native JSON Schema: For example, Google’s Gemini API now uses native JSON Schema support via responseJsonSchema instead of the custom Schema object conversion. This is an internal change with no API surface changes for you, except that now you can pass in much more interesting JSON schemas - including anyOf, $ref, and other JSON Schema features that weren’t previously supported.
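As a concrete illustration, this is the kind of schema shape that now passes through unchanged (shown here as a raw JSON map; how you wrap it in your schema type of choice is up to you):

// A hypothetical schema using anyOf and $ref, features the Gemini API
// now accepts natively via responseJsonSchema.
final contactSchema = <String, dynamic>{
  'type': 'object',
  'properties': {
    'contact': {
      'anyOf': [
        {r'$ref': r'#/$defs/email'},
        {r'$ref': r'#/$defs/phone'},
      ],
    },
  },
  r'$defs': {
    'email': {'type': 'string', 'format': 'email'},
    'phone': {'type': 'string'},
  },
};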

Quality of Life

I’ve also made some smaller improvements based on real-world user feedback. Keep those cards and letters coming!

Custom Headers

Real-world enterprise deployments often need to pass custom headers to API calls - for authentication proxies, request tracing, compliance logging, you name it. For those cases, dartantic 2.0 adds custom headers:

final provider = GoogleProvider(
  apiKey: apiKey,
  headers: {
    'X-Request-ID': requestId,
    'X-Tenant-ID': tenantId,
  },
);

This has been plumbed through all of the providers: OpenAI, Google, Anthropic, Mistral, and Ollama. The headers flow through to all API calls, and custom headers can even override internal headers when needed.

Google Function Calling Mode

Also, in case you’d like to control just how hard you push on Gemini using the tools you pass in, I added functionCallingMode and allowedFunctionNames properties to GoogleChatModelOptions:

final agent = Agent(
  'google',
  chatModelOptions: GoogleChatModelOptions(
    functionCallingMode: GoogleFunctionCallingMode.any, // Force tool calls
    allowedFunctionNames: ['get_weather'], // Limit to specific functions
  ),
  tools: ...
);

Available modes:

  • auto (default): Model decides when to call functions
  • any: Model always calls a function
  • none: Model never calls functions
  • validated: Like auto but validates calls with constrained decoding

Breaking Changes

I took this opportunity in the major version bump to break some things that have been bothering me.

Simplified Provider Lookup

I removed static provider instances, e.g. Providers.google, since they weren’t useful in practice. Either you want the default initialization for a project and the convenience of using a model string, e.g. Agent('claude'), or you want to use the type and create a provider instance with non-defaults, e.g. OpenAIProvider('openai-responses:gpt-5', apiKey: ...). The halfway measure of having a typed default instance was good for discovery, but if you’re using syntax completion to choose your LLM, now you’ve got two problems. :)

// OLD
final provider = Providers.openai;

// NEW
final provider = OpenAIProvider();

Once I removed the static instances, there was no need for an entire type just to look up providers, so I moved that to Agent instead. Also, providers are now created via factory functions, not cached instances.

// OLD
final provider = Providers.get('openai');
final allProviders = Providers.all;
Providers.providerMap['custom'] = MyProvider();

// NEW
final provider = Agent.getProvider('openai');
final allProviders = Agent.allProviders;
Agent.providerFactories['custom'] = MyProvider.new;

Custom providers can be plugged into the new Agent.providerFactories map, so name-based lookup works just like it does for the built-in providers.

Removed ProviderCaps

I added ProviderCaps originally to help users drill in on what providers they could use in their apps. However, it really became “what are the capabilities of the default model of that provider” because every model on every provider is different and cannot be captured with one enum. It’s still useful for driving tests, so I moved it into the tests and took it out of the provider interface as misleading.

// OLD
final visionProviders = Providers.allWith({ProviderCaps.chatVision});

// NEW
// use Provider.listModels() and choose via ModelInfo instead

For runtime capability discovery, use Provider.listModels() instead - it gives you more accurate per-model information.

Removed Flaky Intrinsic Providers

There are lots and lots of OpenAI-compatible providers in the world, so trying to test Dartantic against all of them is impractical. Plus, most of them don’t do such a great job of actually implementing the features, e.g. multi-turn tool calling.

So, I’ve removed three of them from the list of built-in providers (Together, Google OpenAI-compat, and Ollama OpenAI-compat) and moved them to the openai_compat.dart example. You can still use them and define them in your app - in fact, they can be configured to work exactly like the built-in providers using the new Agent.providerFactories - but they’re not built in and they’re no longer part of the Dartantic testing suite.

I did leave the OpenRouter provider as built-in via Agent('openrouter') since it’s so popular and they do a good job of implementing the API across their models.

Exposing dartantic_interface from dartantic_ai

The dartantic_interface package is great for building your own providers without pulling in all of Dartantic. However, the way I had it split meant that you had to import both packages into every file that used them both. No more!

// OLD - had to import both packages
import 'package:dartantic_ai/dartantic_ai.dart';
import 'package:dartantic_interface/dartantic_interface.dart';

// NEW - one import does it all
import 'package:dartantic_ai/dartantic_ai.dart';

What’s Next?

I’m continuing to track the LLM provider landscape and add support for new features as they become available. I’ve certainly got plenty on my list to do. : )

If you run into issues or have feature requests, please open an issue on GitHub. And if you build something cool with Dartantic, let me know! I’d love to hear about it.

You can get the details in the dartantic docs: https://docs.dartantic.ai

Enjoy!

July 22, 2025 ai

The 5 Stages of AI Grief

“The future is already here — it’s just not very evenly distributed.” –W. Gibson

As a consultant and speaker, I talk to a lot of software engineers. As AI coding tools have gotten better (and they have gotten much better), I’ve watched engineers move through what feels a lot like the stages of grief.

1. Denial: “AI is a fad. It’ll go away soon.”

I still run into a large number of engineers who are convinced that AI coding is just hype that’ll blow over like so many other tech fads (Web 3, blockchain, NFTs, … they have a point). They refuse to even try ChatGPT, Copilot or Claude Code. They’ve been writing code the same way their whole lives and why should they change now?

2. Anger: “AI doesn’t work. I tried it two years ago and it sucked.”

Some engineers have tried AI coding tools, but they tried them when they were genuinely pretty bad. GPT-3.5 in early 2023 was… let’s call it “inconsistent.” So they tried it, it failed to solve their problem, and they walked away convinced that AI coding was overhyped nonsense.

Or, these are the engineers with strict company policies about which tools and which LLMs they can and can’t use, often limited to in-house, self-made stuff that leaves something to be desired. To be fair, these home-grown tools often do suck.

The problem is that these engineers haven’t tried (or aren’t allowed to try) the current crop of tools. They don’t realize that the AI that couldn’t write a for-loop eighteen months ago can now architect entire applications, if used correctly. But until they get over that hump, they’re not going to see the value in them.

3. Depression: Vibe coding: “Build me a healthcare app”

This is the stage where engineers realize AI actually works, but they’re holding it wrong. They’re trying to go from zero to sixty in 3.5 seconds with prompts like “Build me a healthcare app” or “Fix this bug in my code.”

Vibe coding works great when you’re building something small and self-contained. But then you ship that healthcare app and get hacked with an SQL-injection attack because you’re not an engineer at all and you don’t know what SQL injection is. That’s when the depression sets in.

The tool isn’t the problem. The problem is treating AI like a magic wand instead of a chainsaw. The chainsaw will get the job done, but you’ve got to know what you’re doing or you’re just going to hurt yourself.

4. Bargaining: Describe coding: “Let’s design a new feature”

“I don’t do vibe coding.” OK. Well. I have “experimented” with it. But I was younger. And I needed the money. And it was tastefully done!

Now, I’ve grown a bit and realize that there are better ways. I’ve adopted a technique I call “describe coding” (patent pending. all rights reserved. I do own the domain name : ). Instead of leaping right into “build me a thing that does a thing”, I start with “let’s think about this for a minute.” I like AI coding tools that have an explicit planning mode, but even without that mode I use prompts like “I’d like to design a … It should work like … It should have these constraints… ask any questions you may have before you start coding.” Actually, that last sentence I use so often, I’ve stored it in a snippet behind a hotkey on my Mac.

I collaborate with the AI until we come up with a design I like. Then the implementation. Then fixing the compiler errors (and the warnings, I’m not an animal!).

Then I dump each substantial feature into a design doc so that, in the future, I can get myself and the AI back on the same page when it’s time for updates and bug fixes.

Then I generate tests and run them till they pass, being careful to really check that the tests model the behavior I want. Then update the docs and produce the PR.

Notice what I’m doing: design => implementation => testing => …

This is the standard practice that we’ve been doing for decades. You could call the use of my traditional software engineering techniques “bargaining” but I like to think of it as applying the techniques that really work, just in a new way.

The combo of AI + the application of software engineering is where the magic happens, letting me code in minutes or hours what used to take days or weeks. The AI becomes a super-fast pair programming partner instead of a glorified autocomplete.

Plus, it’s so much frickin’ fun, I can’t stand it! If you only get to this stage, you’ll do well with AI coding.

And there’s still one more stage to go!

5. Acceptance: Orchestrating agent swarms

The engineers who’ve made it to acceptance aren’t just using AI to write code. They’re orchestrating whole sets of AI agents. Personally, I like to keep about three projects going at once, switching between them, using different instances of my favorite AI coding agent of the week (things are changing really fast). Even keeping to just a few projects, I still find myself with the “oops wrong chat” problem, so that may be my limit.

Some people are pushing the limits, however, by combining AI agents at the command line with tmux and a little-known feature of git called “worktrees,” which gives each agent instance its own playground.

Once you’re swarming, you’re not a software engineer anymore — now you’re an engineering manager with a team. Swarm mode is where your points really rack up.

1. Denial… Again: “Remote coding”

Remote coding is giving AI agents in the cloud access to your repo and asking them to do things; they can create entire PRs without you ever seeing the code or the running app.

There is a growing set of AI coding products that provide for the “close your eyes and think of England” kind of remote asynchronous coding and it’s not OK with me!

And so the cycle begins again with denial.

The Questions We’re All Asking

When I discuss the advancement of AI into our day to day work with folks, some questions keep coming up:

Will using AI atrophy my knowledge? Some skills will almost certainly atrophy - that’s what happens to muscles that aren’t used. But we lost our tails because we never used them either. If something dries up and falls off due to disuse, how important was it in the first place?

Will AI take my job? I honestly don’t know. But I do know this: software engineers who use AI tooling are going to be more effective and efficient than those who don’t. And when promotion time comes around, or when layoffs happen, effectiveness and efficiency are good qualities to have.

Will AI produce more, different jobs? Every single technology we’ve ever invented has put someone out of work. When Grog the plump caveman lost his cuddling-to-keep-the-tribe-warm job after fire was invented, his extra berries at bedtime were gone.

I don’t know what happened to Grog, but I can tell you that the invention of fire led to pretty much every other job. Should we avoid a new technology because it points to an uncertain future?

Every new technology seems different - this one only destroys jobs and doesn’t create them - because the lost jobs happen first and we can’t see into the future. The Luddites broke into fabric mills and destroyed steam-powered looms because they didn’t want to lose their jobs. Of course, we ended up creating many more jobs with the machines that looms ultimately led to than the looms ever created on their own.

Is AI going to be different? Is it only going to destroy jobs and not create them? I don’t know. But if that were the case, it would be unique in human history.

Where Are We?

All technology advances happen faster now. The spread of electricity took ~45 years to reach 25% of the US population. The internet took ~7 years. AI has been making changes in our society as large as those other two in just 2.5 years. What that means for us puny humans, I don’t know.

But here’s what I do know: the grief you’re experiencing with the oncoming AI is real. We are losing things. We’re already losing knowledge and skills. Artists and other content creators seem to have lost their IP. The new people entering every field that uses AI are going to do things differently; the old ways will be lost.

The question isn’t whether we’re going to lose things with AI. The question is: is what we gain going to be worth it? Will it be a higher standard of living for our children? Will AI help us solve real-world problems that we’ve been unable to solve ourselves? Once we get over the grief, will we be able to experience joy in the new world that we’ve created for ourselves?

I hope so.

The engineers who are curious, who are experimenting, who are moving through these stages thoughtfully - I believe that they’ll do just fine in whatever future we create for ourselves. The ones who get permanently stuck in anger or denial? I think they’ll have a harder time.

The future is already here. What are you going to do about it?

July 20, 2025 flutter ai

Welcome to dartantic_ai 1.0!

Dartantic is an agentic framework designed to make building client and server-side apps in Dart with generative AI easier and more fun!

It works across providers (Google, OpenAI, Anthropic, etc) and runs anywhere your Dart code runs (Flutter desktop, Flutter mobile, Flutter web, CLI, server).

It allows you to write code like this:

// Tools that work together
final tools = [
  Tool(
    name: 'get_current_time',
    description: 'Get the current date and time',
    onCall: (_) async => {'result': DateTime.now().toIso8601String()},
  ),
  Tool(
    name: 'find_events',
    description: 'Find events for a date',
    inputSchema: JsonSchema.object({
      'date': JsonSchema.string(),
    }),
    onCall: (args) async => ..., // find events
  ),
];

// Agent chains tools automatically, no matter what provider you're using,
// e.g. openai, google, openrouter or your custom provider. And if you want to
// specify the model, you can, e.g. "openai:gpt-4o", "google:gemini-2.5-flash" or
// "your-provider:your-model".
final agent = Agent('openai', tools: tools);
final result = await agent.send('What events do I have today?');

// Agent will:
// 1. Call get_current_time to figure out what "today" means
// 2. Extract date from response
// 3. Call find_events with that date
// 4. Return final answer with events

I had all of that working with Gemini and OpenAI LLMs three weeks ago. I just needed to add support for a few more providers and I’d be ready for a 1.0. So I did what anyone would do: I spent three weeks rebuilding dartantic from first principles.

Building on langchain_dart

It was three weeks ago when I first really dove into the most excellent langchain_dart repo from David Miguel Lozano. And when I did, I discovered that he was way ahead of me with features AND providers. There was a lot of Langchain stuff in there of course — David had been very thorough — but it also had a lovely compatibility layer over the set of LLM provider-specific Dart SDK packages (which David also built and maintained). So, on the day after I launched dartantic 0.9.7 at FlutterCon in New York, I sat down with Claude Code and carved my way into David’s Langchain implementation, chipping away until I had extracted that compat-layer.

And on top of that, I built dartantic_ai 1.0.

As you can see from the most epic CHANGELOG ever, I learned a ton from David along the way, including:

  • to use Dart types for typed output on the Agent.sendFor<TOutput> method instead of on the Agent itself so each LLM response can have its own type
  • to use Dart types for typed input on tool calls on the parameterized Tool<TInput> type itself
  • to use a parameterized model options parameter so each model can be created in a generic way, but also support provider-specific typed model options
  • to expose a set of static provider instances, e.g. Providers.openai, Providers.anthropic, etc. to make it easy to just grab one without using string names if you don’t want to
  • to expose usage tracking
  • to handle embeddings in chunks
  • and so many other tiny details that just makes dartantic better!

David’s langchain base allowed me to build support for 11x providers, 5x native (Mistral, Anthropic, Google, OpenAI and Ollama) and 6x more OpenAI-compatible configurations (Together, Cohere and Lambda as well as Ollama and Google configurations for their OpenAI-compatible endpoints). All 11x providers handle chat and 5x of them handle embeddings. I started with more OpenAI-compatible configurations, but their implementations were either so weak or so flaky or so both (I’m looking at you, Nvidia) that I dropped them — they couldn’t pass the more than 1100 tests I built out to test dartantic’s support for them. But feel free to drop in your own!

Industrial Strength

On top of David’s langchain work, I then built out a lot of new features for dartantic, including:

  • custom providers that participate in the named lookup just like the built-in providers
  • typed output
  • typed tool input
  • typed output WITH tool calls WITH streaming (progressive JSON rendering anyone?)
  • multi-provider chat-compatible message format
  • thorough logging w/ easy setup and filtering
  • usage tracking
  • and more…
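
One of those bullets deserves a tiny illustration before I point you at the docs: switching providers, built-in or custom (custom providers join the same named lookup), is meant to be a one-string change, using the same provider:model names from the example at the top of this post. A sketch:

import 'package:dartantic_ai/dartantic_ai.dart';

Future<void> main() async {
  const prompt = 'Summarize the dartantic 1.0 release in one sentence.';

  // Same code, different provider: only the name string changes. A custom
  // provider registered for named lookup works the same way, e.g.
  // 'your-provider:your-model'.
  for (final model in ['openai:gpt-4o', 'google:gemini-2.5-flash']) {
    final agent = Agent(model);
    final result = await agent.send(prompt);

    // Assuming the result exposes its text as output; check the docs for
    // the exact property name.
    print('$model: ${result.output}');
  }
}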

You can see the nitty gritty in the dartantic docs.

What’s Next

I’ve separated out the core dartantic interfaces so that you can build a dartantic provider without depending on all of dartantic, and so that I can make sure dartantic continues to run everywhere that Dart runs. I’m working with the nice folks at Cactus to get their enterprise-grade, mobile-device-optimized local LLMs into dartantic as a custom provider. I also want to get a provider for firebase_ai in there for my Flutter peeps who don’t want to mess with API keys in their client apps.

Of course, anyone who wants to can build a dartantic provider. Let me know if you do! I’d love to track them in the docs.

I also have plans to support image generation and audio transcription, as well as the new OpenAI Responses API and context caching to reduce token usage.

And I have big dreams for a dartantic builder that translates Dart types into JSON serialization and JSON schema for you automatically, streamlining agent creation considerably:

@Agent()
class TacAgent extends Agent {
  TacAgent(super.model);

  @Run()
  Future<TownAndCountry> run(String prompt) => _$TownAndCountryAgentRun(prompt);
  
  @Tool()
  Future<DateTime> getCurrentDateTime() async => DateTime.now();
}

I’m tracking my ideas for the future of dartantic on GitHub. Feel free to add your own.

Where Are We

My goal with dartantic isn’t for me to be a one-man band. The idea is that dartantic can grow with the AI needs of the Dart and Flutter community, while maintaining its principles of multi-provider support, multi-platform support and fun!

Want to steer where dartantic goes? Hate something and want it fixed? Get involved! Here’s how:

If you’re building AI-powered apps in Dart or Flutter, give dartantic a try. Switch between providers. Use typed output. Make tool calls. Build agents. Break things. Swear at it. Then come tell me what went wrong.

Welcome to dartantic 1.0. Let’s go break some stuff together.

July 16, 2025 ai

A Young Software Engineer’s Guide to AI


I was on an AI-focused podcast last week talking about how a new software engineer should work differently in this new era of AI. It reminded me of a conversation I had recently with a college grad in their first professional software engineering job. They were asking for advice from a wizened old coder (I was even wearing the suspenders). With the latest in AI coding tools, they were productive for sure, but they didn’t understand all of the code that the AI agent was producing. They were afraid that without being forced to write the code the old-fashioned way for cough 40 years cough, they weren’t ever really going to understand what they were doing.

I don’t think that they are alone.

With that in mind, I’ve composed a letter for new SWEs just getting started with their careers.

Dear New Software Engineer

Congratulations on landing your first job! I know it’s exciting and terrifying at the same time. You’re probably sitting in an open-plan office (sorry about that), watching veterans write code with instincts honed through years of hands-on experience. Watching them work, you’re probably wondering how you’ll develop that same intuition when the latest corporate mandates encourage you to use AI coding agents. Especially when those agents generate code you don’t understand.

That’s OK. You’re not expected to understand all of the code you see at first, no matter who (or what) wrote it.

But that doesn’t mean you get to shrug your shoulders and press the commit button.

You’re an engineer: it’s your job to stand behind every line of code that you submit.

If you don’t understand what the AI is suggesting, then it’s your job to learn, even if you no longer have to write that code yourself. The good news is you’re in a better position to develop coding instincts than any generation of software engineers before you.

How We Old-Timers Learned

Back in my day, we walked uphill both ways to compile our code. And we were happy!

It was a different time and we used different techniques. We spent countless hours reading other people’s code - in books like Steve McConnell’s Code Complete, in discussion groups, on mailing lists. We debugged other people’s mistakes. We made our own mistakes. We’d argue about design patterns in Friday lunch-and-learns and spend weekends diving into open source projects just to understand how they worked.

The process was slow, sometimes tedious, but it built something invaluable: coding instincts. The ability to look at a piece of code and know “that’s just wrong” even before you can articulate why. Having an opinion about what’s OK and what’s not OK in your code. Everyone develops their own instincts, but once you have them, you can spot problems that would take someone else hours to debug.

Learn the Same Way, But Faster

Those same learning techniques still work today. In fact, they work better than ever because you have an all-day, every-day collaboration partner who never gets tired of your questions (sorry, Mike, I know you had other work to do…).

You don’t have to wait for Friday’s lunch-and-learn to discuss the code you read about in Jon Bentley’s Programming Pearls. You can ask your AI to explain itself to you right now. When you see code you don’t understand, ask questions:

  • Why did you choose that approach?
  • What other options were available?
  • Is that the right solution for this specific problem?
  • What are the trade-offs of this approach?
  • Can you show me an alternative way to solve this?
  • What happens if [edge case] occurs?
  • Can you walk through this algorithm step by step?

Here’s a conversation I had recently with my AI-coding-agent-of-the-week:

Me: We (well, you really) just implemented simple hierarchical logging using a naming scheme, e.g. Logger('dartantic.chat.models'). I understand that the Dart logging package has a hierarchical logging feature built in. How does that differ from ours and should we be using that instead?

AI: I’ll take a look…

…looks it up…

AI: Here’s some sample code for using the hierarchical logging feature in the logging package:

…ugly, complicated hierarchical detail control code…

AI: What you’ve done in this library is simpler and fits the needs of the library better. Let’s keep using what you’ve already done.

Me: Thanks. I agree.

I learned about a feature I didn’t know existed, saw how it worked and then decided not to use it. That’s exactly the kind of exploration we’d do in the olden days of yore but now it happens in minutes or hours instead of days or weeks.
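
If you’re curious, here’s roughly the shape of the built-in feature we looked at and decided to skip. This is a sketch using package:logging’s real knobs and the dartantic.chat.* logger names from the conversation above, not dartantic’s actual setup code:

import 'package:logging/logging.dart';

void main() {
  // The built-in hierarchical mode: enable it globally, then every dotted
  // logger name can carry its own level.
  hierarchicalLoggingEnabled = true;
  Logger.root.level = Level.WARNING;                 // quiet by default
  Logger('dartantic.chat').level = Level.FINE;       // but chatty for chat
  Logger('dartantic.chat.models').level = Level.ALL; // and chattier still here

  Logger.root.onRecord.listen((record) =>
      print('[${record.level.name}] ${record.loggerName}: ${record.message}'));

  // dartantic's simpler scheme gets most of the same effect by naming loggers
  // consistently and filtering on the loggerName prefix, which is why we
  // kept it.
  Logger('dartantic.chat.models').fine('hello from the sketch');
}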

Go Further

But you don’t have to stop at learning the way we old-timers did; you can go beyond:

  • Explore architectural patterns we never had time to try
  • Experiment with different approaches in minutes instead of days
  • Get instant feedback on code quality and best practices
  • Build prototypes to test ideas that would have taken weeks to implement
  • Write the spec, generate the code, throw away the code, update the spec, generate the code again, etc.

You’re not just getting help from an AI coder. You’re getting help from an AI coder trained on the collective knowledge of every software engineer who ever lived.

The patterns that took years to discover, the insights that made industry legends famous, the hard-won wisdom from decades of production systems - it’s all there. You just have to ask.

Coding Instincts - An Example

As you do this more, something interesting happens. You start developing opinions about the code you’re seeing. When something feels wrong, you know it, even if you can’t immediately explain why.

That’s when conversations like this start happening:

AI: I see the problem. It’s a bug in the underlying library. I’ll write a workaround.

…Me hitting the stop button…

Me: I’m sorry. I don’t believe you. Are you saying that we’ve just discovered a new bug in the underlying library that happens to coincide with the new code we just wrote? That’s as unlikely as getting hit by lightning while winning the lottery. I’d like you to write some diagnostic code to validate that claim before committing to a workaround, please.

…writing and running some diagnostic code…

AI: You’re right! The problem is in our code…

That’s what coding instincts look like. And it’s that gut feeling that you’re developing every time you ask the AI to explain itself, every time you ask for alternatives, every time you push back on its suggestions.

Don’t just accept what the AI gives you. Ask it to write tests for the code it created. Ask it to add diagnostic output to validate its assumptions. Ask it to critique its own solution.

Every time you do this, you’re doing the same thing we did when we spent hours in design review meetings or practiced test-driven development. You’re just doing it faster, with more patience from your “mentor,” and with access to more knowledge than any of us ever had.

Keep Asking Why

I know you feel pressure to be immediately productive, to understand everything right away. Don’t get caught up in that. Every software engineer is constantly learning. Even after 40 years of doing things the hard way, I still learn things from AI every day.

If you stay curious and engaged, your coding instincts will develop just fine. In fact, they’ll probably develop faster and be more robust than mine ever were.

Yours sincerely, Chris Sells

P.S. Keep a close eye on AI coders that wrap errors in exception handling in the name of “defensive coding” but really just to get the tests passing. They’re sneaky that way…

