Redesigning EventBuzz for engagement that actually works
A full platform redesign across the pre-login website, organizer admin dashboard, and attendee experience — built design-system-first, with Figma Make AI replacing the traditional hand-crafted path to high-fidelity screens.
What is EventBuzz?
EventBuzz is a browser-based event engagement platform built for conference organizers, corporate trainers, and workshop facilitators. It enables live Q&A, polls, quizzes, and real-time analytics — no app required for attendees.
The platform serves two audiences: organizers who configure and manage sessions, and attendees who participate from any device. When this redesign started, both groups were struggling with the experience.
My role
I led end-to-end design from discovery through handoff. The scope covered three surfaces: the public-facing pre-login website, the organizer admin dashboard, and the attendee engagement view.
A key part of this project was using Figma Make — an AI-powered design tool inside Figma — to generate UI directly from the design system I built. This replaced traditional hand-crafted wireframing with a faster, AI-directed loop.
The ask
Unify the pre-login and post-login experience, simplify the organizer admin, improve the attendee participation flow, and establish a scalable visual system.
What was broken and why it mattered
The original platform accumulated features without a coherent design strategy. User interviews and session recordings revealed three persistent patterns of friction.
Organizer overwhelm
No clear hierarchy in the admin panel. Creating a session, adding a poll, and publishing an event lived in disconnected places. First-time organizers frequently gave up before completing setup.
Attendee drop-off
The attendee view was text-heavy and unresponsive on mobile. Joining a live poll required multiple steps with no sense of live participation happening around you.
Analytics buried
Real-time engagement data was invisible during live events. Post-event reports required spreadsheet export to be useful. Key decisions were made blind.
"I spent 45 minutes trying to create my first event and still didn't know if it was published. I ended up sending a plain email instead."Organizer — User Interview, Research Phase
Understanding both sides of the room
I conducted 12 moderated user interviews split between event organizers and attendees. Key themes were mapped using affinity clustering in Miro.
Organizers wanted speed and confidence — to know their event was live and their audience was engaging. Attendees wanted immediacy — to see their vote reflected instantly and feel the energy of the room.
Session recordings from 80 events confirmed where drop-offs happened. A funnel analysis in Mixpanel pinpointed the exact screens where users abandoned.
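A funnel analysis of this kind boils down to step-to-step conversion: for each pair of adjacent screens, divide the users who reached the later screen by those who reached the earlier one, then flag the worst pair. A minimal sketch in TypeScript — the screen names and counts below are hypothetical placeholders, not the actual Mixpanel data:

```typescript
// Illustrative funnel drop-off calculation. Step names and user counts
// are made up for the example; they are not EventBuzz metrics.
type FunnelStep = { screen: string; users: number };

function biggestDropOff(steps: FunnelStep[]): { from: string; to: string; rate: number } {
  // rate = fraction of users retained between two adjacent screens
  let worst = { from: "", to: "", rate: 1 };
  for (let i = 1; i < steps.length; i++) {
    const rate = steps[i].users / steps[i - 1].users;
    if (rate < worst.rate) {
      worst = { from: steps[i - 1].screen, to: steps[i].screen, rate };
    }
  }
  return worst;
}

const funnel: FunnelStep[] = [
  { screen: "Create event", users: 1000 },
  { screen: "Add session", users: 720 },
  { screen: "Add poll", users: 310 },
  { screen: "Publish", users: 280 },
];

// Reports the pair of screens with the heaviest abandonment.
console.log(biggestDropOff(funnel));
```

With numbers like these, the analysis immediately points at the transition with the heaviest abandonment, which is exactly how the problem screens were identified.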
Research methods
- 12 user interviews — 6 organizers, 6 attendees
- Session recording review (FullStory, 80 sessions)
- Competitive audit — Slido, Mentimeter, Poll Everywhere
- Drop-off funnel analysis (Mixpanel)
- Usability testing — 3 rounds, 5 participants each
Design system first, then AI-generated UI
Instead of going from wireframes to high-fidelity screens by hand, I built the design system first and used Figma Make to generate UI from it — giving every surface a consistent visual language from day one.
Discover
Interviews, session recordings, analytics audit, and competitive review to define the problem space.
Define + Design System
Personas, IA, job stories — and building the core design system: tokens, components, and documentation connector for AI context.
Generate with Figma Make
Used the design system as context to generate UI with Figma Make AI. Iterated on prompts, reviewed output, and refined screens across all surfaces.
Validate
Usability testing with Maze, iteration on edge cases, developer handoff, and launch support.
How Figma Make changed the way I designed
The traditional design process has a long middle step between strategy and high-fidelity screens: wireframes, layout exploration, visual passes, component assembly. This project eliminated that step.
I built a complete design system in Figma — color tokens, typography scale, spacing rules, and a 30+ component library. I also created a documentation connector that gave Figma Make structured context: when to use each component and how surfaces should relate visually.
The AI used the design system as its source of truth. Output was already on-brand and consistent. My work shifted from building pixels to reviewing, directing, and refining AI output.
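To make the "source of truth" idea concrete, here is a minimal sketch of what design tokens can look like in code. The names, values, and scale are hypothetical illustrations, not the actual EventBuzz system:

```typescript
// Hypothetical token definitions (illustrative values only): defined
// once, then consumed by every surface and exported as AI context.
const tokens = {
  color: {
    brand: "#5B3DF5",
    surface: "#FFFFFF",
    textPrimary: "#1A1A2E",
  },
  // Modular type scale: each step multiplies the base size by a ratio.
  typeScale: { base: 16, ratio: 1.25, steps: 5 },
  // Spacing on a 4px grid keeps layouts consistent across surfaces.
  spacing: [0, 4, 8, 12, 16, 24, 32, 48],
} as const;

// Resolve a font size from the scale, e.g. for heading levels.
function fontSize(step: number): number {
  return Math.round(tokens.typeScale.base * Math.pow(tokens.typeScale.ratio, step));
}

console.log(fontSize(0), fontSize(2)); // 16 25
```

Because everything derives from one structure like this, any surface (and any AI-generated screen) that reads from it stays on the same scale by construction.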
What the AI workflow changed
- Before: Manual wireframing, then a separate visual design pass for every screen
- After: Design system built once; Figma Make generated screens directly from it
- Gain: Pre-login and post-login stayed visually consistent without extra effort
- Gain: 3× more layout directions explored within the same timeline
- Role: Designer as director — prompting, reviewing, and shaping AI output
Before and after
Three surfaces drove the design effort. Here is what changed across the pre-login website, the organizer admin dashboard, and the attendee view.
Pre-login website

Before:
- Generic layout with no clear hierarchy. Value proposition buried below the fold.
- Visual language disconnected from the product UI — one brand before login, another inside the app.
- No feature preview or social proof. Organizers couldn't evaluate the platform before signing up.
- Single CTA with no path differentiation for organizers vs attendees.

After:
- Bold hero with a clear value headline, product visual, and primary CTA above the fold on all screen sizes.
- Design system tokens applied consistently — same colors, type scale, and component patterns throughout website and dashboard.
- Feature showcase section with real UI previews to help organizers evaluate before committing.
- Dual entry points — one CTA for organizers, one for attendees joining with a code.
Organizer admin dashboard

Before:
- No visible event status. Organizers couldn't tell if an event was live, draft, or published.
- Poll creation buried three levels deep inside session settings with no preview.
- Analytics only available after the event ended. No real-time view during live sessions.
- No onboarding or guidance for first-time organizers. The interface assumed prior knowledge.

After:
- Persistent status bar with event state, countdown timer, and one-click publish / go-live toggle.
- Poll builder accessible from the session view with a live preview of how it appears to attendees.
- Live analytics panel embedded in the dashboard with key engagement signals updating in real time.
- Guided event setup flow with contextual tips and progress indicators for new organizers.
Attendee view

Before:
- Static, text-heavy interface with no visual feedback when a vote was submitted.
- Poor mobile layout. Buttons were too small and content required horizontal scrolling.
- Q&A questions disappeared after submission with no confirmation or queue visibility.

After:
- Animated vote confirmation and a live bar chart showing how the audience voted in real time.
- Mobile-first layout with large tap targets, single-column structure, and an offline fallback state.
- Q&A shows submission confirmation, queue position, and upvote count from other attendees.
Results after launch
The redesigned platform launched in Q3. Key metrics were tracked over the first 60 days post-launch against the previous 60-day baseline.
Organizers completed setup 68% faster on average. The guided flow eliminated the most common drop-off point.
Poll response rates tripled after the attendee view redesign. The animated feedback loop removed friction at every step.
CSAT rose from 61% to 91%. Organizers cited real-time analytics and the event status bar as top improvements.
Support volume dropped 40%, tied directly to contextual onboarding and improved information hierarchy in the admin panel.
What I took away
1. Serve two audiences without splitting the experience
Designing for organizers and attendees required careful role separation without creating two disconnected products. Every organizer decision had a downstream effect on the attendee experience.

2. Real-time feedback changes behavior
Adding live analytics to the admin view changed how organizers ran their events. Seeing engagement drop prompted them to switch activities. The dashboard became an active tool, not a reporting layer.

3. Confidence is a UX outcome
Organizers needed to feel in control. The status bar, preview pane, and guided setup flow all served one goal: giving organizers the confidence to hit go without second-guessing themselves.

4. Mobile-first is not optional for participation tools
Attendees joined on phones in meeting rooms and auditoriums. Designing desktop-first and scaling down was the wrong approach. Rebuilding the attendee view mobile-first solved participation drop-off immediately.

5. Design systems unlock AI-assisted production
Using Figma Make without a design system would have produced inconsistent, generic output. The system was not just a deliverable — it was the context layer that made AI generation useful. The documentation connector gave the AI the rules it needed to produce on-brand screens without constant correction. This workflow will shape every future project.
See the full experience
Three Figma prototypes cover every surface of the redesign — the pre-login website, the organizer admin dashboard, and the attendee view.