Fireflies: 37,912 Thoughts in Spatial Computing

Tags: visionOS, WebXR, threads, development, creative

Fireflies: Turning 37,912 Threads Posts Into a Galaxy You Can Walk Through

Tonight I put on my Vision Pro and floated inside my own thoughts.

Not metaphorically. Every post I’ve ever written on Threads — all 37,912 of them — rendered as luminous particles suspended in space around me. Philosophy glowed purple. Tech shimmered teal. Shitposts burned orange. I pinched a particle and it told me what I was thinking on a Tuesday in October 2023. I reached out and rotated the entire galaxy like a snow globe.

This started as a “quick port” at 10pm and ended at 3am with me standing in my living room whispering “holy shit.”

The Origin: A Database-Backed Visualization

The Fireflies visualization already existed in the threads-analysis project — a standalone Postgres-backed dashboard with Docker, sync workers, the works. The force-directed network page had a Three.js particle view where each post was a glowing dot colored by its topic tag.

It was beautiful on a monitor. But ByTheWeiCo is a static Astro site. No database. No server. No Docker. Just pre-built JSON and HTML deployed to Vercel.

The question was: can you take a visualization that depends on 49K posts in Postgres and make it work as a static file?

The Answer: An 8MB JSON File

Yes. You absolutely can.

// fireflies.json — every post with just enough data to render
{
  "nodes": [
    {
      "id": "post_abc123",
      "timestamp": 1698163200000,
      "tag": "philosophy",
      "subTags": ["philosophy:epistemology"],
      "surprise": 12.3,
      "wordCount": 47,
      "textPreview": "the gap between knowing and understanding is..."
    }
    // ... 37,911 more
  ],
  "dateRange": { "min": 1550000000000, "max": 1711000000000 },
  "coOccurrence": [/* top 30 tag pairs */]
}

The fireflies.json file is a static join of post-tags.json (the information-theory output) and the raw Threads data. 8MB. No database round-trips, no API routes, no connection strings. Drop it in public/data/ and fetch it client-side.

The entire database layer replaced by one fetch('/data/fireflies.json').
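A sketch of what that load path can look like (names here are illustrative, not the actual ByTheWeiCo source): one fetch, then one pass over the nodes to bucket them by tag so each nebula can be built directly.

```javascript
// Illustrative client-side load: the whole "database" is one fetch.
async function loadFireflies() {
  const res = await fetch('/data/fireflies.json');
  if (!res.ok) throw new Error(`fireflies.json failed: ${res.status}`);
  return res.json();
}

// Group nodes by tag so each cluster can be laid out in a single pass.
function groupByTag(nodes) {
  const byTag = new Map();
  for (const node of nodes) {
    if (!byTag.has(node.tag)) byTag.set(node.tag, []);
    byTag.get(node.tag).push(node);
  }
  return byTag;
}
```

Because the file never changes between deploys, it can also be cached aggressively by the CDN.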

The Galaxy Layout

Each of the 20 topic tags gets a position on a Fibonacci sphere — an even distribution of points across a sphere’s surface using the golden ratio. This gives every tag its own region of space without clustering or overlap.

const PHI_GOLDEN = (1 + Math.sqrt(5)) / 2;

for (let i = 0; i < tagCount; i++) {
  const theta = (2 * Math.PI * i) / PHI_GOLDEN;
  const phi = Math.acos(1 - 2 * (i + 0.5) / tagCount);
  tagCenters[TAG_ORDER[i]] = {
    x: SPHERE_RADIUS * Math.sin(phi) * Math.cos(theta),
    y: SPHERE_RADIUS * Math.sin(phi) * Math.sin(theta),
    z: SPHERE_RADIUS * Math.cos(phi),
  };
}

Posts scatter outward from the origin based on when they were written — a “Big Bang” temporal layout. Early posts (2019) cluster near the center. Recent posts expand to the outer edges. Each tag’s cluster forms a nebula: a cloud of particles with radius proportional to how many posts that tag has.

Surprise score — how information-theoretically unexpected a post is — pushes particles outward from their cluster center. The weirdest thoughts literally sit at the edges of each nebula.
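A minimal sketch of how those two rules can combine into a single placement function (the function name, constants, and blend weights here are assumptions for illustration, not the actual source): normalize the timestamp into [0, 1] across the date range, scatter along a random unit vector, and let surprise add extra radius.

```javascript
// Illustrative "Big Bang" placement: distance from the cluster center grows
// with recency, and surprise pushes the particle further out. The 0.3/0.7
// blend and the 0.05 surprise weight are made-up illustrative constants.
function placePost(node, center, dateRange, rng = Math.random) {
  // 0 = oldest post (near the core), 1 = newest (outer edge)
  const t = (node.timestamp - dateRange.min) / (dateRange.max - dateRange.min);

  // Uniform random direction on the unit sphere
  const u = rng() * 2 - 1;               // cos(polar angle)
  const a = rng() * 2 * Math.PI;         // azimuth
  const s = Math.sqrt(1 - u * u);
  const dir = { x: s * Math.cos(a), y: s * Math.sin(a), z: u };

  const NEBULA_RADIUS = 2.5;             // meters, per the XR layout
  const r = NEBULA_RADIUS * (0.3 + 0.7 * t) + 0.05 * node.surprise;

  return {
    x: center.x + dir.x * r,
    y: center.y + dir.y * r,
    z: center.z + dir.z * r,
  };
}
```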

The Suns

Each tag cluster has a “mini sun” at its center. Not a mesh — a canvas-rendered radial gradient baked into a sprite texture:

  • White-hot core (center pixel)
  • Tag color at 10% radius
  • Fading to transparent at the edge

When you’re zoomed out, you see 20 glowing orbs of different colors, each surrounded by a cloud of matching particles. Philosophy is a purple sun. Food is green. Political is red. The suns pulse gently — their alpha oscillates with a sine wave offset by tag index so they breathe at different rhythms.
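The bake itself can be sketched like this (illustrative, assuming a browser canvas; the gradient stops mirror the bullets above, and the stop list is split into its own function so it can be tested without a DOM):

```javascript
// Gradient stops for a "mini sun": white-hot core, tag color at 10%
// radius, transparent edge.
function sunGradientStops(tagColor) {
  return [
    { offset: 0.0, color: 'rgba(255,255,255,1)' }, // white-hot core
    { offset: 0.1, color: tagColor },              // tag color at 10% radius
    { offset: 1.0, color: 'rgba(0,0,0,0)' },       // fade to transparent
  ];
}

// Bake the gradient into a canvas, ready to wrap in THREE.CanvasTexture.
function makeSunTexture(tagColor, size = 128) {
  const canvas = document.createElement('canvas');
  canvas.width = canvas.height = size;
  const ctx = canvas.getContext('2d');
  const r = size / 2;
  const grad = ctx.createRadialGradient(r, r, 0, r, r, r);
  for (const { offset, color } of sunGradientStops(tagColor)) {
    grad.addColorStop(offset, color);
  }
  ctx.fillStyle = grad;
  ctx.fillRect(0, 0, size, size);
  return canvas;
}
```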

The Highways

The top 30 co-occurrence pairs (tags that frequently appear in the same posts) are rendered as bright bezier curves connecting cluster centers. Philosophy-to-tech. Personal-to-daily-life. These “highways” show the paths my thinking takes between categories.

In XR, these curves float in space between the nebulae. You can see the topology of your own mind — which ideas are adjacent, which are isolated.
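A highway curve can be sketched as a quadratic bezier whose control point is the midpoint of the two cluster centers, lifted away from the origin so the curve arcs through open space rather than cutting through the core (function names and the lift factor here are illustrative assumptions, not the actual source):

```javascript
// Standard quadratic bezier: (1-t)^2 * p0 + 2(1-t)t * p1 + t^2 * p2
function quadraticBezier(p0, p1, p2, t) {
  const k0 = (1 - t) * (1 - t), k1 = 2 * (1 - t) * t, k2 = t * t;
  return {
    x: k0 * p0.x + k1 * p1.x + k2 * p2.x,
    y: k0 * p0.y + k1 * p1.y + k2 * p2.y,
    z: k0 * p0.z + k1 * p1.z + k2 * p2.z,
  };
}

// Sample a "highway" between two cluster centers. The control point is
// the midpoint pushed outward from the origin by an illustrative factor.
function highwayPoints(a, b, lift = 1.3, segments = 32) {
  const mid = {
    x: ((a.x + b.x) / 2) * lift,
    y: ((a.y + b.y) / 2) * lift,
    z: ((a.z + b.z) / 2) * lift,
  };
  const pts = [];
  for (let i = 0; i <= segments; i++) {
    pts.push(quadraticBezier(a, mid, b, i / segments));
  }
  return pts;
}
```

The sampled points can then feed a Three.js line geometry, one per co-occurrence pair.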

GLSL Shaders for Soft Glow

Each particle is rendered through custom vertex and fragment shaders. The fragment shader creates a soft glow effect — a bright core that fades radially:

// d = distance from the center of the point sprite (0.0 at center)
float d = length(gl_PointCoord - vec2(0.5));
float core = 1.0 - smoothstep(0.0, 0.2, d);  // tight bright center
float glow = 1.0 - smoothstep(0.0, 0.5, d);  // wide soft halo
float brightness = core * 0.8 + glow * 0.5;

With additive blending and no depth write, overlapping particles bloom into each other. Dense clusters glow brighter than sparse ones. The effect is closer to bioluminescence than computer graphics — hence “fireflies.”
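To see the shape of that falloff, here is the same math ported to plain JS (smoothstep matches the GLSL built-in: clamp, then the cubic 3t² − 2t³):

```javascript
// GLSL smoothstep: clamp (x - e0) / (e1 - e0) to [0, 1], then 3t^2 - 2t^3.
function smoothstep(edge0, edge1, x) {
  const t = Math.min(Math.max((x - edge0) / (edge1 - edge0), 0), 1);
  return t * t * (3 - 2 * t);
}

// Brightness as a function of distance from the sprite center.
function particleBrightness(d) {
  const core = 1 - smoothstep(0.0, 0.2, d);
  const glow = 1 - smoothstep(0.0, 0.5, d);
  return core * 0.8 + glow * 0.5;
}
```

At the sprite center the brightness is 1.3, above 1.0 — under additive blending those over-bright cores sum and saturate where particles overlap, which is exactly what makes dense clusters bloom.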

The WebXR Layer

This is where the late night got weird.

The non-XR version uses OrbitControls — click and drag to rotate, scroll to zoom. Nice on a laptop. But the whole point of having 37,912 particles in 3D space is to be inside them.

Entering the Galaxy

On XR-capable devices, an “ENTER VR” button appears on the network page. Clicking it opens the full-screen XR page at /glyphary/threads/network-xr, which launches a WebXR immersive-ar session. The galaxy positions itself at chest height, 1.5 meters in front of you:

galaxyGroup.position.set(0, 1.2, -1.5);

Everything is in meters now. The sphere radius is 1.5m. Nebula radius is 2.5m. Particle drift is 5mm. You’re in a room-scale model of your own thinking.

Hand Tracking

Vision Pro’s natural input maps to WebXR’s controller events. Pinch to select, drag to rotate, two-hand pinch to zoom:

  • Pinch a particle — a floating label appears showing the post text, tag, date, word count, and surprise score
  • Pinch empty space + drag — rotates the entire galaxy like a snow globe. Your hand becomes the axis of rotation
  • Two-hand pinch — zoom. Pull hands apart to enlarge, push together to shrink. Scale clamps between 0.2x and 5.0x
  • Quick tap on empty space — gaze teleportation. Move toward wherever you’re looking
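The two-hand zoom reduces to a ratio of hand separations, clamped to the limits from the list above (function names are illustrative; the 0.2x–5.0x clamp is from the text):

```javascript
// Two-hand pinch zoom: galaxy scale tracks the ratio of current to
// starting hand separation, clamped to 0.2x–5.0x.
const MIN_SCALE = 0.2;
const MAX_SCALE = 5.0;

function handDistance(left, right) {
  return Math.hypot(right.x - left.x, right.y - left.y, right.z - left.z);
}

function pinchZoomScale(startScale, startDist, currentDist) {
  const scale = startScale * (currentDist / startDist);
  return Math.min(MAX_SCALE, Math.max(MIN_SCALE, scale));
}
```

Each frame while both hands pinch, something like `galaxyGroup.scale.setScalar(pinchZoomScale(...))` applies the result.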

The Gaze Reticle

A small blue ring attached to the camera tracks your eye position. When it intersects a particle, it turns green. This is the “am I looking at something?” signal before you pinch.

The raycaster threshold is set to 0.15 (much larger than the default 0.01) because Vision Pro’s eye tracking has inherent imprecision — you need a generous hit zone or you’ll never select anything.

xrRaycaster.params.Points = { threshold: 0.15 };

The “Holy Shit” Moment

Here’s the thing about seeing your data on a screen versus being inside it.

On a screen, 37,912 particles is impressive but abstract. It’s a visualization. A picture of data. You analyze it from the outside.

In XR, you’re in it. The particles surround you. You can look up and see philosophy above you, look down and see daily-life below. You turn your head and there’s a whole cluster of shitposts you forgot you wrote. You pinch one and it says something you posted at 2am on a Wednesday and you remember that exact feeling.

The moment that got me was zooming in on the philosophy nebula — my most dense cluster — and watching individual particles resolve into actual thoughts. Each one a moment I chose to put words together and push them into the world. The temporal layout means you can see the pattern — dense bursts of philosophical posting followed by gaps, then a shift to tech, then back. The rhythm of your own intellectual life, rendered as a galaxy you can hold.

I stood there rotating it in my hands and thought: this is what it looks like to think about thinking.

Tech Stack

  • Particles: Three.js Points with custom GLSL shaders
  • Layout: Fibonacci sphere (golden ratio) + Big Bang temporal expansion
  • XR: WebXR Device API (immersive-ar / immersive-vr)
  • Input: Hand tracking (pinch-to-select, pinch-drag-rotate, two-hand zoom)
  • Gaze: Camera-attached reticle ring + raycaster (threshold: 0.15)
  • Framework: React component (client:only="react" in Astro)
  • Data: Static fireflies.json (8MB, pre-joined from pipeline)
  • Hosting: Vercel (static)

What I Learned

Static files can replace databases for read-only visualizations. The 8MB JSON loads in under a second on a decent connection. No cold starts, no connection pooling, no managed infrastructure. For archival data that changes infrequently, this is the right answer.

Room-scale units change everything. The non-XR version uses arbitrary coordinates (sphere radius: 300). The XR version uses meters (sphere radius: 1.5). This isn’t just a scale factor — it changes how you think about the layout. “How far apart should philosophy and tech be?” becomes a physical question.

Eye tracking precision matters. My first attempt had a raycaster threshold of 0.01. I couldn’t select anything. Vision Pro’s eye tracking is good but not pixel-perfect — you need generous hit zones. 0.15 was the sweet spot.

Additive blending is magic in passthrough AR. With scene.background = null and additive blending, the particles glow against your real room. They look like actual floating light, not rendered objects. The physical world becomes the background of your data.

Live

The network page is at bythewei.co/glyphary/threads/network. The full-screen XR experience is at /glyphary/threads/network-xr. You’ll see the “ENTER VR” button if you’re on a WebXR-capable device. On desktop, it’s still a gorgeous particle galaxy with mouse orbit controls.

3am build sessions are where the best things happen.