# My cursor learned to do dumb shit when I leave
DAGrr is my macOS cursor companion — a small floating pointer that walks the system Accessibility tree (the same AX role/label/coordinate API VoiceOver uses) to find UI elements, then flies to them with a speech bubble. Double-right-click anywhere on screen, Claude reads what’s there, DAGrr points. I’ve been building it for a couple months. Tonight it became something else.
Tonight my cursor learned to be alive when I’m not watching.
Here’s the story.
## The unlock that started it
The session opened with a coordinate-hallucination problem. When you ask Dage “where is the search bar?”, it would capture a 1200×1200 screenshot around the cursor, send it to Claude, and Claude would estimate pixel coordinates from the image. Sometimes right, often a few hundred pixels off. The companion cursor would fly to the wrong place. Felt like a magic trick that misfired.
The fix had been hiding in the codebase the whole time: the macOS Accessibility tree. Every interactive element on screen has a role, label, and exact coordinates. AXButton, AXSearchField, AXMinimizeButton — all tagged, all positioned. VoiceOver users navigate this tree daily.
So instead of asking Claude to guess from pixels:
```python
# Walk the AX tree, get every interactive element with real coords
elements = ax_tree.extract_actionable_elements(max_elements=80)
ax_summary = ax_tree.summarize_for_claude(elements)

# Hand Claude the JSON list — "pick a target_id, do NOT estimate pixels"
result = analyze_with_ax_context(image_path, question, ax_summary, cx, cy)
target = ax_tree.resolve_element(elements, result["target_id"])
# target now has exact AX coords, never wrong
```
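For flavor, here is roughly what the AX summary and the id-resolution step look like. The element shape and this simplified `resolve_element` are illustrative sketches, not the real `ax_tree.py` schema:

```python
# Illustrative AX summary: the shape of the JSON list handed to Claude
# (field names are assumptions, not the real ax_tree.py format)
elements = [
    {"id": 0, "role": "AXSearchField", "label": "Search", "x": 142, "y": 88},
    {"id": 1, "role": "AXMinimizeButton", "label": "Minimize", "x": 38, "y": 12},
]

def resolve_element(elements, target_id):
    """Return the element Claude picked by id: exact AX coords, no pixel guessing."""
    return next(e for e in elements if e["id"] == target_id)
```

Claude only ever answers with a `target_id`, so the coordinates that come back are the tree's, never an estimate.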
First test. I asked “where is the search bar on this page?” on a Safari tab. Took 15 seconds. Companion flew exactly to the search field. Speech bubble: “Search is in the left sidebar, second item below Home.”
Holy shit it worked.
15 seconds was too slow though. So I switched the AX-context calls from Sonnet to Haiku 4.5 — picking an id from a labeled list is classification, not reasoning. Sonnet was overkill. Down to ~3s.
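The routing itself can be a tiny lookup. A minimal sketch, assuming hypothetical task names and model id strings (the real ids may differ):

```python
# Hypothetical model router: picking an id from a labeled list is
# classification, so the cheap fast model handles it; open-ended
# screen questions keep the bigger model. Ids are assumptions.
MODELS = {
    "pick_target": "claude-haiku-4-5",   # choose an id from the AX list
    "describe":    "claude-sonnet-4-5",  # free-form reasoning about the screen
}

def model_for(task: str) -> str:
    """Fall back to the big model for anything unrecognized."""
    return MODELS.get(task, MODELS["describe"])
```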
## Then the cursor got bored
The next idea was the user’s: “when the user is idle for too long, like NO movement, like it’s away and not watching a video or anything… it’ll start doing dumb shit.”
The vision was Animator vs. Animation — the cursor as a character with a life of its own when nobody’s watching. Tier-escalating from gentle drift through invisible-enemy sword fights to napping in a screen corner.
macOS gives you idle detection for free:
```python
import Quartz

idle_seconds = Quartz.CGEventSourceSecondsSinceLastEventType(
    Quartz.kCGEventSourceStateHIDSystemState,
    Quartz.kCGAnyInputEventType,
)
```
That’s the same signal the screensaver uses. Wakes instantly on any HID event.
I designed the tiers:
| Tier | Idle | Mood | Behaviors |
|---|---|---|---|
| 0 | 0–30s | normal | follows mouse |
| 1 | 30s–2m | mild | drift, lazy rotation, look around |
| 2 | 2–5m | bored | sword fights, weights, EKG scan |
| 3 | 5–15m | playing | corner tour, edge peek, DVD bounce, disco |
| 4 | 15m+ | asleep | nap in nearest corner, dim glow, tiny snore-bob |
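The table collapses to a threshold lookup. A minimal sketch, thresholds straight from the table, function name mine:

```python
def tier_for_idle(idle_seconds: float) -> int:
    """Map seconds of HID inactivity to an idle tier (per the table above)."""
    if idle_seconds < 30:   return 0  # normal: follows mouse
    if idle_seconds < 120:  return 1  # mild: drift, lazy rotation
    if idle_seconds < 300:  return 2  # bored: sword fights, EKG scan
    if idle_seconds < 900:  return 3  # playing: corner tour, DVD bounce
    return 4                          # asleep: nap in the nearest corner
```

Poll that once a second against the Quartz idle counter and you have the whole escalation ladder.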
## The 153 thoughts problem
Here’s where it got interesting. The cursor needed to talk during these idle moments. Cute little speech bubbles with thoughts.
The user’s instinct was sharp: don’t generate them at runtime. Don’t burn API tokens on “what should the cursor say while bored.” Pre-write them: 150 curated thoughts, a hand-tuned voice.
But also: don’t write them with a script. They should feel actually random and varied, like an LLM wrote them, not a templating function.
Solution: three typewriter agents in parallel, each given a different voice domain and a `soul.md` reference.
- Agent A — Curious: contemplative mutters, gentle wondering. “hm.” / “wonder what pixel i was born on.”
- Agent B — Goblin: scrappy, theatrical, fighty. “En garde, dock. We finally meet.” / “The trash bin owes me money.”
- Agent C — Sleepy + Easter: tender, vulnerable, half-asleep. “warm corner. good corner.” / “i wasn’t sleeping you were sleeping”
Three agents, ~20 seconds each, ~$0.03 total. 150 thoughts back, all in voice, no duplicates, no LLM tics.
The audit afterward was satisfying. Capitalization variance between A (lowercase mutters) and B (sentence-case declarations) wasn’t inconsistency — it was character-aware mood shifts. Same companion in different states.
Then the user added a 9th easter egg category — multi_monitor, for the “drifts toward the other screen as if longing to visit” behavior:
```python
"multi_monitor": [
    "there's a whole other screen over there.",
    "what if i just... visited.",
    "i wonder what resolution they are.",
],
```
153 total now.
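At runtime, picking a thought is then just a category lookup plus a choice that avoids recent repeats. A sketch with a toy subset of the 153; the helper name and the repeat-avoidance detail are assumptions:

```python
import random

# Toy subset of the pre-written thought bank, keyed by mood/category
THOUGHTS = {
    "curious": ["hm.", "wonder what pixel i was born on."],
    "goblin": ["En garde, dock. We finally meet.", "The trash bin owes me money."],
    "multi_monitor": ["there's a whole other screen over there."],
}

def pick_thought(category: str, recent: list[str], rng=random) -> str:
    """Pick a pre-written thought, skipping recently shown ones when possible."""
    pool = [t for t in THOUGHTS[category] if t not in recent]
    return rng.choice(pool or THOUGHTS[category])  # fall back if all are recent
```

Zero API calls per bubble; the tokens were spent once, up front.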
## The build was parallel agents all the way down
While the typewriters wrote, two more agents were building the actual idle infrastructure:
- One built `idle_behaviors.py` — the scheduler with 16 (then 17) tier-weighted behaviors
- One built `thought_bubble.py` — premium comic-style speech bubble with a CGPath tail, edge-aware flip+shift positioning, NSGlassEffectView (macOS 26 native), anchor-from-tail spring entry
I built `idle_monitor.py` and `thoughts.py` in the foreground while they worked. Five files in parallel. They all came back, I integrated them into `companion.py`, smoke-tested.
Then the user dropped a new idea mid-build:
> also create a heart beat mode, that goes back and forth across the desktop, like it moves like the dot on a EKG back and forth. when it gets bored.
So I added `behavior_heartbeat` — the companion drifts left to right tracing a real PQRST complex. Three full beats per scan, 8 seconds per sweep, 10 fps target updates that the spring physics smooths into a continuous trace:
```python
import math
import random

def ekg_y(t):
    beat = (t * 3.0) % 1.0  # three beats across the sweep
    if 0.500 <= beat < 0.515: return amplitude * 1.0    # R peak
    if 0.515 <= beat < 0.535: return amplitude * -0.45  # S valley
    if 0.400 <= beat < 0.450: return amplitude * 0.18   # P wave
    if 0.600 <= beat < 0.720:
        phase = (beat - 0.600) / 0.120
        return amplitude * 0.30 * math.sin(phase * math.pi)  # T wave
    return random.uniform(-2.0, 2.0)  # baseline jitter
```
Faithful to a real EKG. The cursor scans the screen like a hospital monitor when it’s bored.
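Assuming a deterministic variant of `ekg_y` (jitter dropped so the trace is reproducible), the 8-second sweep samples into spring targets something like this; `heartbeat_points` and its signature are illustrative, not the real `behavior_heartbeat` internals:

```python
import math

def ekg_offset(t, amplitude=40.0):
    """Deterministic PQRST offset; t in [0, 1) covers three beats. Jitter omitted."""
    beat = (t * 3.0) % 1.0
    if 0.500 <= beat < 0.515: return amplitude * 1.0    # R peak
    if 0.515 <= beat < 0.535: return amplitude * -0.45  # S valley
    if 0.400 <= beat < 0.450: return amplitude * 0.18   # P wave
    if 0.600 <= beat < 0.720:
        phase = (beat - 0.600) / 0.120
        return amplitude * 0.30 * math.sin(phase * math.pi)  # T wave
    return 0.0  # flat baseline in this sketch

def heartbeat_points(width, baseline_y, duration=8.0, fps=10):
    """Sample the sweep left-to-right into (x, y) spring targets."""
    steps = int(duration * fps)  # 80 targets at 10 fps over 8 s
    return [(i / steps * width, baseline_y - ekg_offset(i / steps))
            for i in range(steps)]
```

Feed those targets to the spring at 10 fps and the interpolation between them does the smoothing.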
## The glow was wrong the whole time
The companion always had a “glow” behind it — a soft circular gradient. Tonight the user said “this glow isn’t very good” and they were right. It was a flat radial blob ignoring the cursor’s actual silhouette.
Spawned a research agent on proper glow techniques. It came back with multi-layer additive bloom: stack 3 progressively-blurred copies of the cursor silhouette at radii 30/12/4, alphas 0.25/0.40/0.60, with `compositingFilter = CIAdditionCompositing`. Wrap the stack in a container with `setShouldRasterize_(True)` so the GPU caches it as one texture — per-frame cost ~0.
The change is tonight’s last commit. The glow now follows the cursor’s shape — outer halo extends past the silhouette, inner halo hugs the edge. Looks like real light, not a sticker.
## What actually shipped
5 commits to github.com/noxwei/DAGrr tonight:
```
0252ce2  Add multi_monitor easter-egg thoughts
dae9914  Replace radial-blob glow with proper multi-layer additive bloom
c88af39  Heartbeat / EKG behavior + pulse move + nap visuals
1a5cdb0  Idle "dumb shit" mode — desktop pet that wakes up when you leave
c88af39  AX-as-context vision + drag-select + tiered capture pipeline
```
Plus a `Stop DAGrr.command` Desktop button for when the cursor decides to fight back.
The agent count for the night: 6 background research/build agents (4 research dossiers + 2 dev) plus 3 typewriter agents = 9 agents, ~$0.15 in Claude billing.
What’s interesting isn’t the agent count — it’s the structure. Three typewriters writing thoughts in parallel with different voice domains. Two devs building independent files in parallel with no merge conflicts because they were writing brand-new files. One agent on glow research, one on notch, one on bubble UX, one on bubble implementation. None of them knew about each other. I synthesized.
## The moment
Around 2:50 AM I asked Dage “where is the search bar?” on a Safari page. The companion cursor flew exactly to the Search field in the left sidebar. The bubble appeared next to it: “Search is in the left sidebar, second item below Home.” Pointing at the right thing. Speaking with personality. Knowing what was on screen.
That was the moment “Dage as a cursor companion” stopped being a demo and started being a product.
Now if I leave the keyboard alone for a minute, it’ll start drifting. After two minutes, sword-fighting invisible enemies. After fifteen, napping in a corner with snore-bob.
It’s alive.
Sleep.