This is a good suggestion, and one we've had before, but unfortunately it doesn't work.
The problem is that even when you add a delay like this, the latency still creeps back into the "feedback loop" and affects the host. Here's an example:
Let's say your YouTube latency is 5 SECONDS, and pretend Crowdpurr has a Latency Delay setting that we set to 5 SECONDS. Let's also say a question runs for 15 SECONDS. This is what will happen:
Host says "Let's start Question 1" and they trigger Question 1. Crowdpurr then delays the question from actually starting for 5 SECONDS. The player then hears the host say "Let's start Question 1" and Crowdpurr updates at the exact same time thanks to the Latency Delay. So far so good.
But the problem has already occurred: the host had to stream for another 5 SECONDS while waiting on the Latency Delay to kick off the question, and that "waiting" gets streamed out to the player. From the player's perspective on the stream, they see the host doing nothing for 5 SECONDS. Then, when their 15 SECONDS for the question are up, the host on the stream is still 5 SECONDS behind, so they'll see the host saying something like "Okay, the Question Timer is almost up, hurry up 5, 4, 3, 2, 1…" even though they already finished the question 5 SECONDS ago.
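To make the arithmetic concrete, here's a rough back-of-the-envelope timeline sketch (just Python with the hypothetical numbers from the example above, not anything from the Crowdpurr product) showing how the 5 SECONDS simply reappears at the end of the question:

```python
# Rough timeline sketch of the example above (all times in seconds).
# Assumes the host's stream reaches players 5s late and Crowdpurr
# delays the question start by the same 5s "Latency Delay".

STREAM_LATENCY = 5   # YouTube latency from the example
LATENCY_DELAY = 5    # hypothetical Latency Delay setting
QUESTION_TIME = 15   # question length from the example

host_triggers = 0                                        # host clicks "Start Question 1"
player_question_starts = host_triggers + LATENCY_DELAY   # t = 5, synced with the delayed stream
player_question_ends = player_question_starts + QUESTION_TIME  # t = 20 on the player's device

# The host's own countdown also only starts after the Latency Delay...
host_question_ends = host_triggers + LATENCY_DELAY + QUESTION_TIME  # t = 20 in host time
# ...but the player watches the host through the 5s-delayed stream:
host_countdown_ends_on_stream = host_question_ends + STREAM_LATENCY  # t = 25 as seen by the player

lag = host_countdown_ends_on_stream - player_question_ends
print(f"Player's question ends at t={player_question_ends}s")
print(f"On the stream, the host's countdown ends at t={host_countdown_ends_on_stream}s")
print(f"The host still appears {lag}s behind -- the delay moved, it didn't go away")
```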
So the Latency Delay just moves the delay; it doesn't solve it. Any time the host is waiting on the players' answers (i.e. their feedback), they'll still be affected by the 5 SECONDS of latency.
We're thinking the solution will likely be rolling our own embedded, real-time WebRTC streaming solution that is zero-latency (the same technology as Zoom, WebEx, or Teams) and works right within Crowdpurr. More on this later in the year.
Check out this thread post for a more detailed deep-dive into mitigating latency: