Overview
Daydream's Stream Diffusion service transformed the traditional bird's-eye livestream of BMwebcast into an interactive, AI-powered experience. Instead of watching passively, audiences actively shaped the broadcast by voting on prompts that altered the live video feed in real time.
Challenge
The BMwebcast livestream is popular for its sweeping aerial coverage, but it's historically been a one-way experience. We wanted to explore how Daydream could make large-scale cultural events more engaging by giving audiences a voice in what they see, taking cues from interactive formats like Daydream Live.
Solution
Using the Daydream Stream Diffusion service, the BMwebcast livestream was transformed in real time using artistic style reference images chosen through audience voting. This turned a standard festival stream into an evolving, community-directed artwork. The experience was built in a single week and served hundreds of thousands of viewers.
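The voting loop described above can be sketched in a few lines: tally audience votes over a voting window, pick the winning style prompt, and hand it to the stream-transformation service. This is a minimal illustration only; the function name, vote format, and the commented-out API call are assumptions, not Daydream's actual interface.

```python
from collections import Counter

def tally_votes(votes):
    """Return the winning style prompt from a list of audience votes.

    Ties are broken by which prompt was first seen in the vote stream
    (Counter preserves insertion order for equal counts). Returns None
    if no votes were cast in the window.
    """
    if not votes:
        return None
    return Counter(votes).most_common(1)[0][0]

# Each voting window, the winner would be pushed to the stream
# transformation endpoint (hypothetical call, shown for shape only):
#   client.update_style_prompt(stream_id, winner)
winner = tally_votes(["van gogh", "pixel art", "van gogh", "neon"])
print(winner)  # → van gogh
```

In a deployed version, windows would rotate on a timer so the broadcast style evolves continuously with the audience's choices.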
Results
Engagement Rate
Viewer engagement rose sharply compared to traditional livestreams once interactive AI features were introduced.
Processing Latency
Average time from prompt submission to AI-transformed video output on the livestream. Learn how to build this with the Daydream docs.
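The latency metric above can be computed from paired timestamps: when each prompt was submitted and when its transformed frames first appeared on the stream. A small sketch, with a hypothetical helper name and millisecond integer timestamps assumed for clarity:

```python
def average_latency_ms(submitted_ms, output_ms):
    """Average prompt-to-output latency in milliseconds.

    submitted_ms: dict mapping prompt id -> submission timestamp (ms)
    output_ms:    dict mapping prompt id -> first-output timestamp (ms)
    Prompts with no output yet are ignored; returns None if nothing
    has completed.
    """
    deltas = [
        output_ms[pid] - t
        for pid, t in submitted_ms.items()
        if pid in output_ms
    ]
    return sum(deltas) / len(deltas) if deltas else None

# Two prompts, landing 200 ms and 300 ms after submission:
print(average_latency_ms({"a": 0, "b": 1000}, {"a": 200, "b": 1300}))  # → 250.0
```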
Key Achievements
- Demonstrated real-time AI video processing at scale during a major cultural event
- Showcased scalable infrastructure handling thousands of concurrent AI transformations
- Proved audience co-creation can redefine live entertainment
Takeaway
The BMwebcast × Daydream Live project proved that cultural broadcasts don't have to be static. With Daydream, audiences can be co-creators — shaping events in real time and experiencing live video in a whole new way.