CC: Tweaked

Speaker noise increases with number of playbacks

HendrikLeuschner opened this issue · 6 comments

commented

Minecraft Version

1.19.x

Version

1.101.3

Details

Working on sound synthesis directly in Lua.
Running All the Mods 8, on a remote server.
I am creating PCM audio. The example provided in the guide,

local speaker = peripheral.find("speaker")

local buffer = {}
local t, dt = 0, 2 * math.pi * 220 / 48000
for i = 1, 128 * 1024 do
    buffer[i] = math.floor(math.sin(t) * 127)
    t = (t + dt) % (math.pi * 2)
end
speaker.playAudio(buffer)

creates a sine wave as it should, though when played on the server it is a little noisier than when running the same code in CraftOS.
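(The same buffer can be generated outside the game for inspection. This Python sketch mirrors the Lua snippet above; it only assumes the speaker's documented input format of signed 8-bit samples at 48 kHz, and is just a sanity check that the source signal itself is clean.)

```python
import math

SAMPLE_RATE = 48000  # CC: Tweaked speakers consume 48 kHz signed 8-bit PCM
FREQUENCY = 220      # same tone as the guide example

def make_sine_buffer(n_samples, freq=FREQUENCY, rate=SAMPLE_RATE):
    """Mirror of the Lua loop: signed 8-bit sine samples."""
    buf = []
    t, dt = 0.0, 2 * math.pi * freq / rate
    for _ in range(n_samples):
        buf.append(math.floor(math.sin(t) * 127))
        t = (t + dt) % (2 * math.pi)
    return buf

buf = make_sine_buffer(128 * 1024)
# Every sample stays within the range playAudio accepts (-128..127),
# so the noise cannot come from the generated samples themselves.
assert all(-128 <= v <= 127 for v in buf)
```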

The problem is that rerunning this code multiple times makes the playback noisier every time.
Interestingly, creating a buffer that is filled with 0 amplitude values and playing that seems to "reset" the noise level of the playback. After reset, I can start increasing the noise level again with multiple runs of the sine wave.

So two points:

  1. Any idea why the base noise level is noticeably higher on the server compared to CraftOS or the in-browser example in the guide? (nice feature btw)
  2. Why would the playback noise of the speaker increase when running the same file multiple times?

Thanks a lot!

commented

Could you provide working example code that would replicate this issue? Am I understanding correctly that you are just playing the same buffer over and over again?

commented

Hi Wojbie.
The working code is the one in the post, which is a straight copy from the CCT guide.
I noticed some stronger noise when running this code on the server as compared to e.g. CraftOS.
For debugging, I ran the script as above multiple times from the command line. Each time, the sound distortion becomes stronger.

For testing, I duplicated the script and set the amplitude to 0 instead, creating basically empty sound of the same length:
buffer[i] = 0
This creates a sort of "resetting" script. After running this new script, the noise level is back to what it was when running the normal sine wave script the first time. Re-running the first script afterwards still increases the noise level, again.
Hope that helps.

commented

Yes, this really helps! One more question: were you waiting for the last queued sound to finish before running it again, or queuing it en masse while it was still playing?

commented

Oh, that's really fun, thank you for reporting this!

It looks like some of the internal properties (charge and strength) of the server-side encoder and client-side decoder get progressively out of sync. Plotting the amplitudes after each run shows the waves getting weirder every time.

[Image: a plot of the first 1k samples for a variety of sound waves. The first one looks pretty normal, but later ones start to "wobble" at higher amplitudes (jumping between high and low amplitudes very quickly).]

It looks like the core problem here is that we destroy the encoder once audio stops playing, but keep the decoder around. This means that the strength/amplitude resets to 0 on the server, but keeps its old values on the client.

We'll actually see similar issues if a client starts observing a speaker halfway through playing a song (as the client and server encoder/decoder are out of sync). I think the easiest fix here is just to send the strength and amplitude at the top of each audio packet.
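To make the failure mode concrete, here is a toy adaptive 1-bit codec in Python. This is not the actual DFPWM code CC: Tweaked uses, just a minimal model with the same kind of shared charge/strength state (all constants are illustrative). Resetting the encoder but not the decoder distorts the second playback; seeding the decoder from the encoder's starting state, as a per-packet header would allow, restores it:

```python
import math

FRESH = (0.0, 0.05, False)  # (charge, strength, last_bit) -- illustrative values

def step(state, bit):
    """One shared state update, run identically by encoder and decoder."""
    charge, strength, last_bit = state
    target = 1.0 if bit else -1.0
    charge += strength * (target - charge)
    # Strength grows while the bit stream is steady, decays on transitions.
    strength += 0.01 * ((1.0 if bit == last_bit else 0.0) - strength)
    return (charge, max(strength, 1e-3), bit)

def encode(samples, state):
    bits = []
    for s in samples:
        bit = s > state[0]  # emit one bit: is the signal above our estimate?
        bits.append(bit)
        state = step(state, bit)
    return bits, state

def decode(bits, state):
    out = []
    for b in bits:
        state = step(state, b)
        out.append(state[0])
    return out, state

tone = [math.sin(2 * math.pi * 220 * i / 48000) for i in range(2000)]

# First playback: encoder and decoder both start fresh -- in sync.
bits, _ = encode(tone, FRESH)
clean, stale_decoder = decode(bits, FRESH)

# Second playback: the server rebuilt its encoder (fresh state), but the
# client kept its old decoder state. Same bits, different reconstruction.
bits2, _ = encode(tone, FRESH)
noisy, _ = decode(bits2, stale_decoder)
print("bit streams identical:", bits == bits2)
print("max divergence:", max(abs(a - b) for a, b in zip(clean, noisy)))

# The fix: ship the encoder's starting state with each packet, so the
# decoder can resynchronise itself before decoding.
fixed, _ = decode(bits2, FRESH)
print("fixed playback matches first:", fixed == clean)
```

In this model, the reporter's zero-amplitude "reset" buffer would work for a similar reason: long runs of near-silence drive charge and strength toward the same small values on both sides, resynchronising them by accident.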

commented

@Wojbie waiting for the buffer to empty

@SquidDev Thanks for the visualization! Judging from the wave plots, this looks like how it sounds. I'm not familiar with the inner workings of the audio packet transmission from server to client. What would be the next steps for such a problem in terms of development? Cheers!

commented

Any chance of a backport to 1.19.2?