Mitigating Streaming Latency When Using Crowdpurr

Crowdpurr works great for adding fun trivia competitions to virtual and hybrid meetings, where you may want to “stream” yourself as a live gameshow host while your crowd follows along safely at home.

What Is Streaming Latency Delay?

When you use Crowdpurr with any streaming service, there’s a latency delay that must be “managed”. This is the time between when you, as the host, start speaking and when your audience following along on the stream actually sees and hears you. The delay is caused by the back-end processing required to encode your live video and audio and deliver it to thousands of viewers consuming your stream across the world.

The Problem

The problem with latency delay combined with Crowdpurr is that Crowdpurr works in real time: when you advance to a new trivia question, for example, there is no delay between the moment you initiate the question and the moment your players’ devices update. Crowdpurr can update thousands of player devices instantly, unlike a live video stream.

When combined with a live stream that does have latency, the effect is that player devices update to new questions, correct answers, rankings, etc. many seconds before players actually hear the host refer to them on the stream. If the host announces a new trivia question and initiates it in Crowdpurr at the same time, Crowdpurr updates instantly while the live stream may take several seconds to catch up, causing a disconnect in the player experience.

Not All Streaming Services Are The Same

Some streaming services have worse delay than others. Facebook is one of the worst: Facebook Live latency tends to be around ten seconds (for good reason). Twitch also has several seconds of latency, though it’s less severe than Facebook’s.

YouTube Live has multiple latency settings, including an ultra-low-latency mode. YouTube Live in ultra-low-latency mode is the best we’ve found among actual streaming providers (free or paid). However, it may still have a delay of 5 to 10 seconds, which can cause a problem.

What About Zoom and WebEx?

One solution to the entire problem is simply to use a real-time service like Zoom or WebEx. Both use a different type of “streaming” technology: video teleconferencing. The advantage is that it’s real-time and eliminates all latency; the disadvantage is that it doesn’t support large groups beyond a few hundred participants. So if you’re working with large groups, Zoom or WebEx may not be feasible. Both services offer “large meeting” add-ons, but under the hood those add-ons fall back to traditional streaming technology, with typical latency of up to 3 to 5 seconds.

How to Mitigate Latency In A Stream

If using a live streaming service is necessary, the best advice we have for mitigating the latency is to have a partner run the Crowdpurr Experience Dashboard. The partner listens for the host’s cues, “Next question”, “Show the live answer results!”, “Let’s see the rankings!”, etc., and triggers those actions on the Experience Dashboard.

The key is that, as the host broadcasts live, the Crowdpurr partner listens to the live stream on headphones exactly as the audience hears it, as it’s delivered on the stream, not as the host says the cues live in the studio. This strategy syncs Crowdpurr with the delivered stream (post-latency), not with the host (pre-latency).

The host broadcasts as normal using a pre-printed sheet of the questions and correct answers. Alternatively, the host can bring up the blue Summary tab on the Experience Dashboard in another browser tab; this shows all the questions in one view and doesn’t require switching the Active Question in order to read the next question.

As the host gives cues like “Let’s go to question two” or “Let’s look at the correct answer”, the Crowdpurr partner waits to hear those cues come over the stream (5 to 10 seconds later) and then triggers the next question or results view on the Experience Dashboard. Since Crowdpurr is real-time, players see their devices update at approximately the same time the host announces the cue on the stream.

Eliminating Latency Altogether Through Clever Trickery

Streaming latency can actually be eliminated entirely if the host only discusses static information on the live stream. Static information is what the host knows at all times: the questions, the answer options, and the correct answers. Dynamic information is what the host learns only after players submit their answers: the live answer results (e.g., what all the players voted for) and the live player rankings.

If the host discusses only the questions, correct answers, and other static information, the latency can be eliminated. If the host discusses dynamic information too, the experience is still at the mercy of the latency.

Here are some timing examples to explain:

Static and Dynamic Timing Example (assuming a latency of 5s and a Question Time of 15s)

  1. Host says “Let’s go to Question 2”… (5s of latency passes)
  2. Crowdpurr partner hears host say “Let’s go to Question 2” and triggers Question 2 in Crowdpurr.
  3. Crowd answers for 15 seconds. 15 seconds is up.
  4. Host says “Let’s look at how everyone voted”… (5s of latency passes)
  5. Crowdpurr partner hears host say “Let’s look at how everyone voted” and triggers “Show Live Answer Results”

The problem is that the crowd has to wait out the latency (5s) between steps 3 and 4, because the host must wait for the question to actually end before they can see the results. From steps 1 to 3 the host is 5 seconds “ahead” of everyone, but any time the host needs to wait to see real-time dynamic information (e.g., live answer results or live rankings), the crowd must consequently sit through dead air while the latency passes between the end of the question timer and when the host starts speaking again.
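The dead-air arithmetic above can be sketched in a few lines of Python. This is an illustrative toy model, not anything from Crowdpurr itself; the 5-second latency and 15-second question time are simply the example’s assumptions:

```python
# Toy timeline of the dynamic-information flow above.
# All values are assumptions from the worked example.

def dynamic_dead_air(latency, question_time):
    """Seconds of dead air players sit through after the question ends."""
    host_cues_question = 0
    # The partner hears the cue on the delivered stream, post-latency:
    question_starts = host_cues_question + latency
    question_ends = question_starts + question_time
    # The host can only discuss results after the question actually ends...
    host_cues_results = question_ends
    # ...and players hear that cue one latency period later:
    players_hear_results = host_cues_results + latency
    return players_hear_results - question_ends

print(dynamic_dead_air(5, 15))  # 5 -> players wait through 5s of dead air
```

Note the dead air equals the latency itself, no matter how long the question runs: the wait is pinned to the host needing real results before speaking.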

If the host mentions only static information, the timing can look like this, completely eliminating the latency:

Static-Only Timing Example (assuming a latency of 5s and a Question Time of 15s)

  1. Host says “Let’s go to Question 2”… (5s of latency passes)
  2. Crowdpurr partner hears host say “Let’s go to Question 2” and triggers Question 2 in Crowdpurr.
  3. Crowd answers for 15 seconds. 10 seconds go by. With 5 seconds left on the Question Timer…
  4. Host says “Let’s look at the correct answer which is C. Berries”… (5s of latency passes)
  5. Crowdpurr partner hears host say “Let’s look at the correct answer” and triggers “Show Correct Answer”

If the host only says and shows static information they already know, the latency can be removed by doing everything five seconds ahead of time. Because the host initiates each static-information cue five seconds early, the latency is canceled out. This method takes some practice, but as long as the host knows to start the next cue with five seconds left, they can “beat” the latency. As you can see, this trick doesn’t work with dynamic information: the host can’t cheat ahead by five seconds, because they must wait for the question to completely count down in order to see the answer results and rankings, which reintroduces the latency into the host’s performance.
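Continuing the same toy model (same assumed numbers), the static-only “cheat” works because the host’s head start exactly cancels the latency:

```python
# Toy timeline of the static-only flow: the host cues the answer
# `latency` seconds before the question timer ends.

def static_dead_air(latency, question_time):
    """Dead air when the host cues static info `latency` seconds early."""
    question_starts = latency            # partner triggers after hearing the cue
    question_ends = question_starts + question_time
    # Host cues the correct answer with `latency` seconds left on the timer:
    host_cues_answer = question_ends - latency
    players_hear_answer = host_cues_answer + latency
    return players_hear_answer - question_ends

print(static_dead_air(5, 15))  # 0 -> the cue lands exactly as the timer ends
```

The early cue and the latency are the same 5 seconds, so they net out to zero wait for the players.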

When attempting the static-information-only approach, the host might display the live rankings leaderboard only every five or ten questions, or only at the end of the game or round. This still adds the fun and suspense of live rankings and competition, but limits the latency-delay “dead air” effect to the few times the rankings are shown.

Conclusion

Using a partner to run the Crowdpurr Experience Dashboard based on the delivered stream will always feel more reactive and real-time than doing it by yourself.

If your streaming delay is only 1 to 3 seconds, the above process may be overkill, as that short a latency may not be too detrimental. However, if your delay is over five seconds, the above strategy will definitely make for a better experience.

And if a video-teleconferencing app like Zoom or WebEx can be used, you won’t have any issues with latency delay at all.


This feels like a bit of a bug. When I host Zoom events with Crowdpurr, I just leave the screen sharing off. It’s just easier that way. The audio goes out to players immediately, and players can see the question immediately on their own devices, so the visual latency on the screen sharing only muddies the water.

I’d love to be able to share the Projection Screen, but it just doesn’t work in a timely way on Zoom, so I’ve kind of given up on it.

If you’re using Zoom, as the post mentions, you should have no problem with latency. Zoom shares your desktop just as quickly as it shares your video. If your screen resolution is really high it might be a bit choppy, but you should see no real noticeable latency. If there’s a problem here, it’s not with Crowdpurr; it would be with Zoom or your Internet connection.

And indeed, most hosts who run games on Zoom just share their “talking head”. It’s not necessary to share the Projector View, as all the game information is on the Mobile View. Though some switch back and forth.


That’s fair. It’s just not something that happens with any other thing I share over Zoom (even video clips). It just happens with Crowdpurr.

It’s not a dealbreaker, and certainly we can roll with it by just not sharing the screen, but it is something that’s come up, is all.


Hmm, email us a screen recording of it happening and demonstrating the issue. I’d love to take a look. :+1: help@crowdpurr.com

Hi Ross,
I’ve been considering working with a partner for an event I’m planning, as you describe above, but I have also wondered about the variation in latency across participants. I.e., could some be receiving your video stream at 5 seconds and others at 7 seconds? If the partner operating the Crowdpurr platform were receiving the stream at 7 seconds while some participants are receiving it at 4 seconds, it would give a significant advantage to some players over others, since some would hear the host start to ask the question a few seconds before others hear or see it. My solution was going to be to announce the next question and then wait 2-3 seconds (or fill the dead space) before reading out the question, while my partner activates the question when they hear me announce that the next question is coming up. That way everyone should see the question appear at the same time, either before or as I start to read it.

I also wondered: if someone has a glitch where their phone lags for a reason local to them, does the stream pause and then pick up from where it left off, or does the live stream resume at the same point as everyone else’s?


Good observations. Indeed, each player’s playback of a true streaming service can vary by a few seconds. I’ve found it to almost always be 1 or 2 seconds at most. So yes, one solution would be announcing the question, then giving enough time for everyone to hear it (latency plus the playback differential), then triggering the question in Crowdpurr.

As for a player’s phone dropping the stream (for whatever reason) and then reconnecting, that depends on the service; it’s something to test on whatever service you’re using. I would think it would resume at the current live moment if they were watching live and weren’t behind to begin with. But it’s definitely something to test and to alert your participants about.

Sorry for the dumb question, I’m just kicking tyres here, but it seems all the suggestions are based around the streaming side. Why not look at the other side of the coin and build some latency, deliberately, into Crowdpurr?

The idea being that once you have a good idea of the lag (have someone monitor the livestream for a few minutes while you start the event and go through the format, etc.), you just dial the delay time (e.g. 5 seconds) into the CP settings, and then everything is delayed accordingly. If you get further ahead or behind, you simply adjust the lag setting in CP.

I know it’s not the ideal option, but it IS a change that CP could build into their own software, rather than trying to mitigate another company’s settings.


That would make the latency worse. PLUS, you’d be dealing with the built-in latency ON TOP OF the actual streaming service latency.

The point is to try and reduce/remove latency so that you can do things in as close to real-time as possible.

Unless I’m misunderstanding what you are saying… which is almost always the case. :slight_smile:


Thanks Dandeibert

You’re right that it would make the latency worse (in that the whole event is now a few seconds out of sync with real time), but the idea, I guess, is to make it seem (to the watcher on YouTube at least) that CP is roughly in sync with YouTube, as both of them are now a few seconds behind real time.

So even though it’s not LIVE, at least you wouldn’t have the Q’s on the phone (the participants are seeing the Q’s on their devices using Mobile View at home) 3-5 seconds BEFORE the host asks the Q on YouTube. We are mixing the Projector View and video of the host to send to YouTube (see pic).

I guess I’m thinking along the lines of ‘if you can’t beat em, join em’

It is very possible that I haven’t thought this through. At the moment, it’s just a thought bubble.


@dandeibert is somewhat correct. It doesn’t make the problem worse, but it also doesn’t solve it. It’s the same issue as before.

We actually thought of this. Can’t we just add a simple, customizable “tape delay” in Crowdpurr to match the streaming latency of YouTube, Facebook Live, etc.? Problem solved!.. wrong. :disappointed:

The “tape delay” simply serves the same function as the “Crowdpurr partner” listening to the delivered stream: it updates Crowdpurr when the participants hear the stream as it’s delivered, rather than when it’s broadcast.

With a 5s tape delay, the host now has to wait on the tape delay, and that waiting gets broadcast on the stream. This is the same problem as using a partner: both approaches simply move the latency to the host waiting on Crowdpurr to trigger.

Here’s an example flow of what would occur assuming 5s of latency and a tape delay of 5s set in Crowdpurr:

  1. Host says “Let’s go to Question 2”… (5s of latency passes)
  2. Crowdpurr tape delay of 5s triggers Question 2 in Crowdpurr
  3. Crowd answers for 15 seconds. 15 seconds is up.
  4. Host says “Let’s look at how everyone voted”… (5s of latency passes)
  5. Crowdpurr tape delay of 5s triggers “Show Live Answer Results”

From the participants’ perspective, the above flow is still fraught with latency. After announcing “Let’s go to Question 2” in step 1, the host has to wait 5 seconds for the tape delay to actually start the question (and that waiting gets broadcast on the stream), then wait another fifteen seconds for the question to complete: twenty seconds total.

For the participant, the fifteen-second question starts on their device, but during the first five seconds of the question the stream shows the host vamping while waiting out the 5s tape delay (because that part still gets streamed). The latency then rears its ugly head at the end: on participant devices the fifteen seconds are up, but the stream shows the host waiting another five seconds for the question to end.
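A quick sketch of the tape-delay timeline (again a toy model using the example’s assumed 5s latency, 5s tape delay, and 15s question) shows the latency just moves around rather than disappearing:

```python
# Toy timeline of the tape-delay flow. All values are the
# example's assumptions, not anything measured from Crowdpurr.

def tape_delay_timeline(latency, tape_delay, question_time):
    """Return (vamp, dead_air): seconds of on-stream vamping before
    the question starts, and seconds of dead air after it ends."""
    host_cues_question = 0
    question_starts = host_cues_question + tape_delay   # tape delay fires
    question_ends = question_starts + question_time
    vamp = question_starts - host_cues_question         # still gets streamed
    host_cues_results = question_ends                   # must wait for real results
    players_hear_results = host_cues_results + latency
    dead_air = players_hear_results - question_ends
    return vamp, dead_air

print(tape_delay_timeline(5, 5, 15))  # (5, 5): 5s of vamping plus 5s of dead air
```

The tape delay converts nothing: participants still absorb one latency period of vamping up front and another of dead air at the end.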

Because the host must wait for the participants to actually finish the question before discussing their results, the start-early “cheat” for static information can’t be used (e.g., the host starting the next question five seconds early on the stream to eliminate the latency). So the latency is still observed by the participants one way or another.

Unlike static information, like going to the next question or showing the correct answer, dynamic information can never be “cheated” by starting early. The host must wait, because the answer results and rankings can only be received from the participants once the question’s timer has completed.

So… unfortunately a simple delay in Crowdpurr doesn’t solve the problem entirely on its own.

Thanks,
I thought it seemed a bit too easy!
