Case Study 01 /

KAIBER VIDEO EDITOR

Role

Lead PM

Tools

React,

Users on Superstudio were generating tons of short AI video clips (5 to 15 seconds each) but had no way to package them into something longer. The canvas was great for arranging ideas and sketching out a narrative, but the moment users wanted to actually assemble a video, they had to leave the platform entirely.

I began by validating this problem with real users. We talked to a mix of:

  • Users who had adopted Superstudio as their primary creation tool
  • High-volume creators who were actively downloading and using their generated clips
  • High-volume creators who generated content but then dropped off and didn't come back

All three groups confirmed the same need: they wanted to assemble content on-platform. They also shared a surprising point of agreement about their current workarounds. Tools like CapCut and After Effects weren't failing because they lacked features; they were failing because they had too many. Users didn't need a full editing suite; they needed to arrange clips, add music, and export.

Based on these insights, we ran a design sprint to find the right way to incorporate a feature this large into an already complex product. Our first round of prototypes focused on the relationship between the canvas and the editor: do users prefer the fluidity of a node or panel editor, or the familiar interface of a dedicated surface? We also wanted to understand users’ creative process: do they treat generation and editing as separate, linear steps, or are they interlinked enough to share a space?


While our power users showed interest in a node system for fluid creation and assembly, users overwhelmingly responded to the familiar fullscreen interface. We discovered that our users thought of editing as a “last step”: the stage before finishing a project. Crafting prompts and generating media was a separate workflow, and users wanted to keep these two creative approaches apart.

In our first round of research, a subset of users also surfaced an unexpected need: adding music to their clips with as little effort and input as possible. No timeline, minimal control, just dead-simple set-and-forget audio and video arrangement. We ran a parallel sprint to test whether users responded to a more guided approach to video editing, comparing a step-by-step workflow against the familiar timeline.


We expected that the more familiar a user was with editing tools, the less interested they would be in a guided workflow. In reality, there was a near-universal desire for easier ways to arrange music and video, as long as detailed editing remained available when necessary. Pro users could quickly throw together a draft edit for later fine-tuning, and greener users could skip the timeline entirely.

We built a two-pronged solution. The first was a familiar video editor interface stripped down to only what mattered to users: timeline, asset library linked to the canvas, music, export. No bloated toolbars, no learning curve. The second was an auto-edit feature for users who just wanted their clips synced to the beat of a track with minimal effort.

Both features lived alongside the existing canvas, so users could flip between ideation and production without switching contexts. A persistent asset library tied everything together, keeping clips and files consistent and accessible across both surfaces.

Our original approach for the auto-edit feature was LLM-based: an agent would assess each clip's perceived energy and serve the user a dynamically arranged video matched to their audio. While this approach produced good results, the time the LLM needed to process media created confusion and a mismatch of expectations for users.

The solution we shipped used the Remotion library, which allowed near-instantaneous auto-sync edits plus realtime effects and timing adjustments. Users found this approach aligned with what they expected from an automatic editor.
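To illustrate the kind of planning step an auto-sync edit involves, here is a minimal sketch in TypeScript. It assumes beat timestamps (in seconds) have already been extracted from the audio; the function name and shapes are illustrative, not Superstudio's actual implementation. Each interval between consecutive beats becomes one cut, cycling through the available clips.

```typescript
// Hypothetical beat-sync planner: maps beat timestamps to frame-accurate
// clip sequences. Names and types are illustrative assumptions.
type ClipPlan = {
  clipIndex: number;       // which clip to show for this beat interval
  fromFrame: number;       // first frame of the interval
  durationInFrames: number; // length of the interval in frames
};

function planBeatSync(beatTimes: number[], clipCount: number, fps: number): ClipPlan[] {
  const plans: ClipPlan[] = [];
  // Walk consecutive beat pairs; each pair bounds one clip's screen time.
  for (let i = 0; i < beatTimes.length - 1; i++) {
    const fromFrame = Math.round(beatTimes[i] * fps);
    const toFrame = Math.round(beatTimes[i + 1] * fps);
    plans.push({
      clipIndex: i % clipCount, // cycle through clips so every beat gets a cut
      fromFrame,
      durationInFrames: toFrame - fromFrame,
    });
  }
  return plans;
}
```

In Remotion, each plan entry would map naturally to a `<Sequence from={plan.fromFrame} durationInFrames={plan.durationInFrames}>` wrapping the chosen clip, which is what makes the render near-instantaneous: the arrangement is pure math over timestamps, with no model inference in the loop.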

Superstudio's free-to-paid subscription conversion rate generally hovered around 2%. Within a few weeks of launching the editor, free-to-paid conversion among users who opened the editor grew to 15%, driven by CTAs within the timeline and video editor user journeys. Paid-trial-to-subscription conversion also increased to 33%, up from 20%.

33%
Paid trial to subscription
15%
Free to paid conversion