WEBVTT

00:00.000 --> 00:12.640
Right. Hi. I'm Merlin. Yeah, I'm excited to be here today. Well, so many people. Yeah,

00:12.640 --> 00:17.840
I wasn't originally scheduled to speak today, but I jumped in as a backup. So, I haven't

00:17.840 --> 00:21.640
had quite as much time to prepare as I'd like. So, I hope you'll forgive me if it's a little

00:21.640 --> 00:32.360
rough around the edges. But yeah, I want to talk to you about live coding with Kotlin and

00:32.360 --> 00:39.080
Compose Multiplatform. Yeah, so I'm a full-stack engineer. I do a good bit of Android

00:39.080 --> 00:48.440
development. I've had a casual interest in live coding for some time. And one thing that I work

00:48.440 --> 00:54.120
with a lot, the programming language I work with the most is Kotlin, and I work with this Compose

00:54.120 --> 01:00.040
UI framework. And I have this idea to repurpose it for live coding. So, I have a couple

01:00.040 --> 01:10.200
questions for you. Who here has used Kotlin? All right, quite a few. Who has used Compose

01:10.200 --> 01:17.720
Multiplatform or Jetpack Compose for Android? Fewer, but a few. Okay. And who has used React?

01:19.400 --> 01:26.200
More people. Okay, great. So, I will draw some analogies between Compose and React throughout

01:26.200 --> 01:32.600
the talk. I hope that will help make the concepts more accessible for

01:32.600 --> 01:43.240
some of you. Yeah, so let me show you what I will be talking about. Oh, this is not.

01:43.880 --> 01:52.360
No, I have the slides here, but I can't. Can I? No.

02:04.600 --> 02:13.160
So, about this. There we go. Okay. So, yeah, what I want to talk about is firstly,

02:13.160 --> 02:18.840
Kotlin and Compose Multiplatform, then a little bit about the functional, declarative

02:18.840 --> 02:28.920
paradigm, which underlies Compose, and which is also popular with live coding, then I will

02:28.920 --> 02:35.320
introduce the little library I have been building, which I call Dacapo. And I'll talk about

02:35.320 --> 02:42.120
why I am repurposing a UI framework for music live coding. I'll talk a little bit about

02:42.120 --> 02:48.040
the fundamental building blocks of UI and draw some parallels with fundamental building blocks

02:48.040 --> 02:54.600
of music. I talk about state management and how Dacapo works under the hood.

02:56.600 --> 03:01.960
I talk about implementing these musical building blocks, a DSL for expressing rhythm,

03:02.920 --> 03:10.440
and connecting building blocks together. Yeah, and then I talk about the

03:10.440 --> 03:14.520
multiplatform element of it. So Compose Multiplatform works on different platforms:

03:14.520 --> 03:23.000
desktop, Android, iOS, web, and finally integrations with Compose UI and visualizations.

03:27.640 --> 03:36.200
So, firstly, Kotlin and Compose Multiplatform. So, Compose Multiplatform is a declarative

03:37.080 --> 03:45.240
UI framework with a functional paradigm. It's similar to React. It's not going to the next

03:45.240 --> 03:52.840
slide. There we go. There are similarities with React, which is a JavaScript framework, and Compose

03:52.840 --> 04:00.760
is for Kotlin, basically. Kotlin started out as Java with less boilerplate. It was made by

04:00.760 --> 04:08.680
some frustrated Java developers. Yeah, it's become a flexible multiplatform language with support

04:08.680 --> 04:15.880
for the web, with support even for iOS. So, yeah, that's the multiplatform element. Kotlin is

04:15.880 --> 04:22.440
my main language and I'm using Jetpack Compose, which is the Android version of Compose

04:22.440 --> 04:26.600
Multiplatform, the version that they made first before they made Compose Multiplatform. I use that a lot.

04:27.320 --> 04:32.040
So, this is something that I'm quite used to using and I thought it would be fun to explore

04:32.040 --> 04:40.440
what else can be done with it. Yeah, so Compose is the standard way to create UIs on Android

04:40.440 --> 04:51.560
and also now, with Compose Multiplatform, in Kotlin generally. Yeah, so the functional, declarative

04:51.640 --> 04:58.360
paradigm. It's popular with live coding as well. Some prominent examples: Haskell has Tidal,

04:59.320 --> 05:05.880
JavaScript has Strudel, Clojure has Overtone. They're all either in a functional

05:05.880 --> 05:12.600
language or, in the case of JavaScript, it's just a functional framework. So, this functional, declarative

05:12.600 --> 05:21.320
paradigm, meaning you specify what you want and not how to go about it. We'll see that in a

05:21.320 --> 05:27.800
minute as well with some examples. Yeah, so while Kotlin is not a functional language, it

05:27.800 --> 05:36.600
supports that paradigm, and the UI frameworks Compose and React are built on it. And what I noticed

05:36.600 --> 05:41.640
when looking at these different live coding frameworks, I didn't find any that were actually built

05:41.640 --> 05:45.960
on a UI framework like this. So there's nothing for Compose, and I checked for React and I didn't

05:45.960 --> 05:52.280
find anything there either, which surprised me given the size of the React ecosystem, and I wanted to

05:52.280 --> 06:05.800
try this out. Yeah, I've tried this out with what I call Dacapo. So, leveraging Compose

06:05.800 --> 06:12.040
Multiplatform for audio generation and live coding. It will be available under the MIT license;

06:12.040 --> 06:20.520
it's still in the early stages. So, it's under heavy development right now and yeah, you can check

06:20.520 --> 06:25.640
it out already but I would recommend maybe checking it out at some later point when it's a little

06:25.640 --> 06:31.400
further developed. So, we're talking mostly about the concepts underlying it in this talk, and the

06:31.560 --> 06:52.040
particularly interesting parts of using a UI framework like this. So, yeah, why am I doing this?

06:52.040 --> 06:58.040
Why would I repurpose a UI framework for live coding? Yeah, it turns out basically that a lot of

06:58.040 --> 07:03.960
the features that are useful for UI development can be repurposed, and it lowers the barrier

07:03.960 --> 07:08.920
to entry for developers who are already familiar with the UI framework in this case for Compose

07:08.920 --> 07:15.080
developers. There's a lot of work that you have to do when you make a live coding library and some

07:15.080 --> 07:20.760
of that work can be offloaded to the framework. So, I'm using it for state management, for animating

07:20.840 --> 07:29.080
values, for dynamically adding or removing blocks that build up the audio basically. And yeah,

07:29.080 --> 07:34.440
IntelliJ and Android Studio also have preview features for Compose, where you can preview the UI

07:34.440 --> 07:45.400
right in the IDE with live updates which is also quite handy. Yeah, so, fundamental building blocks,

07:45.400 --> 07:49.320
I'm going to talk about what the fundamental building blocks are in a UI framework and then

07:49.320 --> 07:56.040
draw some parallels with music basically. So, yeah, in a framework like Compose or React,

07:56.040 --> 08:02.120
you have some fundamental blocks. You have layout elements, something like a row to put elements in, or a

08:02.120 --> 08:07.960
column. You have the actual UI components: you have buttons, you have text fields, and so on.

08:09.480 --> 08:15.960
You have state, state for values that leverage whatever state management your UI framework provides,

08:16.920 --> 08:23.320
you have values that change over time: animations. You have effects. So, performing an action whenever

08:23.320 --> 08:29.800
a component enters the tree or leaves the tree. When I say tree in Compose, I mean the composition

08:29.800 --> 08:34.600
in React, in this case it's the DOM basically whenever a component gets mounted or unmounted.

08:35.880 --> 08:40.600
And you also have a need to sometimes make values accessible from anywhere in the tree,

08:40.600 --> 08:47.640
like a theme that you want to apply to every component. So, how does this compare to the requirements

08:47.640 --> 08:55.240
from music? So, rather than layouts, in music you have timing. You want to play something at

08:55.240 --> 09:00.600
the same time as something else, you want to play it in order. You want to express rhythm.

09:02.200 --> 09:06.520
You want to play sounds. These are like your UI building blocks like a button or a text field,

09:06.520 --> 09:13.160
you have instead like a spin or playing some samples. You have again state for values,

09:13.160 --> 09:19.800
values that update and change the music as it plays and for these we can leverage the state

09:19.800 --> 09:27.480
management of the framework. You have values changing over time to make the music dynamic and again

09:27.480 --> 09:34.600
you can leverage the animation capabilities of your framework. You need to schedule and cancel

09:34.600 --> 09:41.400
audio. We look at how Dacapo does this in a minute and this is something where we can leverage the

09:41.400 --> 09:48.120
effects that UI frameworks provide. And then you have things that you want to be available

09:48.120 --> 09:54.680
throughout your program like volume control or setting a BPM that's consistent for all the components.

09:54.680 --> 09:58.440
And this is much like the theming that I mentioned with UI. You want to make some value

09:58.440 --> 10:14.040
available throughout your program. So we start with managing state. So in Compose we have

10:14.920 --> 10:21.000
something called state (mutableStateOf), and this is similar to the useState hook in React, if you're

10:21.000 --> 10:28.360
familiar. Whenever the state changes, the Compose tree, the composition, automatically updates.

10:29.160 --> 10:35.880
So we can, basically, for example, set a synth. We say we want a piano synth, and then

10:35.880 --> 10:40.280
we have a block that actually plays some music using that synth and when we change this value

10:41.480 --> 10:45.480
then the synth will automatically update.

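A minimal sketch of how this might look in Compose. The `PlayMelody` block is hypothetical, standing in for whatever Dacapo actually uses to schedule notes:

```kotlin
import androidx.compose.runtime.*

// Hypothetical sound block standing in for whatever schedules the notes.
@Composable
fun PlayMelody(synth: String) { /* schedule notes with the given synth */ }

@Composable
fun Performance() {
    // remember { mutableStateOf(...) } is Compose's counterpart to React's useState:
    // changing `synth` triggers recomposition, so PlayMelody re-runs with the new value.
    var synth by remember { mutableStateOf("piano") }
    PlayMelody(synth = synth)
}
```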
10:51.240 --> 11:03.240
Then, to actually make it work: inside a library like Dacapo, basically, we need to use

11:03.240 --> 11:09.400
something to schedule audio and to cancel it again. So whenever a component gets mounted whenever

11:09.400 --> 11:14.520
anything gets added to the tree we want to schedule something, some audio that wants to be played.

11:15.160 --> 11:21.480
And whenever we remove a component we need to cancel that audio, cancel that scheduling.

11:22.280 --> 11:28.440
And yeah, to that end, Compose provides something called DisposableEffect, allowing us to run

11:28.440 --> 11:32.680
something when the component enters the tree and something else when the component leaves the tree.

11:33.240 --> 11:36.200
And this is similar again to React's useEffect hook.

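A sketch of that pattern, with a made-up `AudioScheduler` interface (Dacapo's real internals may differ):

```kotlin
import androidx.compose.runtime.*

// Hypothetical scheduler; returns a handle used to cancel scheduled audio.
interface AudioScheduler {
    fun schedule(sound: String): Int
    fun cancel(handle: Int)
}

@Composable
fun Sound(sound: String, scheduler: AudioScheduler) {
    // The effect body runs when this component enters the composition (mount),
    // and onDispose runs when it leaves (unmount), like useEffect's cleanup in React.
    DisposableEffect(sound) {
        val handle = scheduler.schedule(sound)
        onDispose { scheduler.cancel(handle) }
    }
}
```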
11:36.520 --> 11:46.120
Yeah then let's look at making values available throughout.

11:49.480 --> 11:53.000
Basically for things like a volume control or setting BPM.

11:54.600 --> 11:57.640
This is where Compose provides something called CompositionLocal.

11:58.760 --> 12:01.640
This is similar to React's context, if you're familiar.

12:02.600 --> 12:07.800
Basically a way to set a value and have it be available anywhere downstream in the tree.

12:09.000 --> 12:14.680
So basically we make a Composable function here for an absolute volume where we set the volume

12:14.680 --> 12:24.520
via the CompositionLocal. And you can also set relative volume the same way, by reading the

12:25.080 --> 12:30.360
current value of the CompositionLocal and modifying it.

12:31.000 --> 12:36.040
And the same can be done for setting BPM, for setting a relative tempo, say half as fast as the

12:36.040 --> 12:40.520
current one. So this is where the CompositionLocal is quite handy.

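A sketch of how absolute and relative volume might be built on a CompositionLocal. The `Volume` and `RelativeVolume` names are illustrative, not Dacapo's confirmed API:

```kotlin
import androidx.compose.runtime.*

// Like React context: provide a value once, read it anywhere downstream in the tree.
val LocalVolume = compositionLocalOf { 1.0f }

// Set an absolute volume for everything nested inside.
@Composable
fun Volume(volume: Float, content: @Composable () -> Unit) {
    CompositionLocalProvider(LocalVolume provides volume, content = content)
}

// Scale the current volume relatively by reading it and providing a modified value.
@Composable
fun RelativeVolume(factor: Float, content: @Composable () -> Unit) {
    val current = LocalVolume.current
    CompositionLocalProvider(LocalVolume provides current * factor, content = content)
}
```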
12:43.880 --> 12:50.120
Then, yeah, for synths and samples. So, actually, I'm building these from the ground up, from scratch.

12:50.120 --> 12:59.160
And this is fun. It's also quite educational because I'm not as familiar with some of these concepts

12:59.240 --> 13:02.680
and I'm getting some familiarity with, like, the fundamentals here, which is really nice.

13:03.480 --> 13:06.760
So we schedule synths and samples through effects.

13:09.320 --> 13:13.320
And then they can basically be embedded in other blocks to modify the sound.

13:15.640 --> 13:22.760
We want to shape the envelope of the sound, like an ADSR envelope: attack, decay,

13:23.640 --> 13:29.640
sustain, release. And what should that look like?

13:30.280 --> 13:37.000
So there's the option to basically use blocks and just say okay you embed your synth inside an envelope

13:37.000 --> 13:44.040
block and that gives you an envelope shaped for your synth. But there's something more convenient.

13:44.520 --> 13:55.160
It's what Compose calls modifiers. These modifiers are basically used for padding,

13:55.160 --> 14:01.400
for sizing, for setting a border around a component. And I've repurposed this

14:01.400 --> 14:09.400
modifier concept and I'm calling mine conditioners. So here we can basically say we play a synth

14:09.480 --> 14:16.520
and we condition it with an envelope. And we can do the same for effects like if we have a reverb

14:16.520 --> 14:22.520
component we can nest a synth component in it and then we have a synth with some reverb or we can

14:22.520 --> 14:27.480
use the conditioning to basically achieve the same outcome.

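A rough sketch of the conditioner idea in plain Kotlin, modeled on how Compose chains Modifiers. All names here (Conditioner, envelope, gain) are illustrative, not Dacapo's API, and the envelope is only a fade-in, not a full ADSR:

```kotlin
// A conditioner transforms a buffer of audio samples.
fun interface Conditioner {
    fun process(samples: DoubleArray): DoubleArray
}

// Chain conditioners the way Modifier.padding().border() chains in Compose.
fun Conditioner.then(next: Conditioner): Conditioner {
    val first = this
    return Conditioner { samples -> next.process(first.process(samples)) }
}

// Toy envelope: a linear fade-in over the first `attackSamples` samples
// (a real ADSR would add decay, sustain, and release stages).
fun envelope(attackSamples: Int) = Conditioner { samples ->
    DoubleArray(samples.size) { i ->
        val g = if (i < attackSamples) i.toDouble() / attackSamples else 1.0
        samples[i] * g
    }
}

// Toy gain stage standing in for an effect like reverb.
fun gain(factor: Double) = Conditioner { samples ->
    DoubleArray(samples.size) { i -> samples[i] * factor }
}
```

A synth's output could then be conditioned with something like `envelope(64).then(gain(0.5))`.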
14:27.720 --> 14:42.360
Yeah, then basic timings and playing sounds simultaneously as well as in order.

14:43.160 --> 14:49.720
So, timing primitives, like playing sounds in parallel, cycling over different sounds,

14:50.120 --> 14:55.400
one on each beat or splitting up time equally. So we can express basic timings like this.

14:56.120 --> 15:03.320
We can play A, and at the same time we'll play B on the first beat. On the second beat,

15:03.880 --> 15:10.120
we'll first play C, and then on the next half beat we'll play D. So basically these timing blocks

15:10.680 --> 15:12.120
can give us that control.

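In plain Kotlin, such timing blocks might reduce to mapping sounds onto beat offsets, something like this (`parallel`, `cycle`, and `split` are illustrative names, not Dacapo's API):

```kotlin
// An event: which sound plays, and at which beat offset.
data class Event(val sound: String, val offsetBeats: Double)

// Play several groups at the same time.
fun parallel(vararg groups: List<Event>): List<Event> = groups.flatMap { it }

// One sound per beat, in order.
fun cycle(vararg sounds: String): List<Event> =
    sounds.mapIndexed { beat, s -> Event(s, beat.toDouble()) }

// Split a single beat equally among several sounds.
fun split(startBeat: Double, vararg sounds: String): List<Event> =
    sounds.mapIndexed { i, s -> Event(s, startBeat + i.toDouble() / sounds.size) }

// "Play a, and at the same time play b on the first beat;
//  on the second beat play c, then d on the next half beat":
val pattern = parallel(
    listOf(Event("a", 0.0)),
    listOf(Event("b", 0.0)) + split(1.0, "c", "d"), // c at beat 1.0, d at beat 1.5
)
```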
15:18.520 --> 15:23.000
Yeah, we can create a drum machine basically just by playing different samples and

15:26.360 --> 15:31.800
again use CompositionLocal to leverage switching out the drum machine for a different one.

15:35.000 --> 15:38.360
We can make a more convenient DSL for expressing rhythms.

15:39.240 --> 15:43.240
Something like this. Basically we use strings. We represent

15:46.040 --> 15:53.960
breaks by spaces and on the other beats we play one sound or another by basically specifying which

15:55.640 --> 15:59.320
sample corresponds to which symbol in the string.

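A sketch of such a string DSL in plain Kotlin; the `rhythm` function and the symbol mapping are illustrative, not Dacapo's actual notation:

```kotlin
// One hit: which sample plays on which beat.
data class Hit(val sample: String, val beat: Int)

// Spaces are rests; every other character plays the sample mapped to that symbol.
fun rhythm(pattern: String, samples: Map<Char, String>): List<Hit> =
    pattern.mapIndexedNotNull { beat, symbol ->
        if (symbol == ' ') null
        else samples[symbol]?.let { Hit(it, beat) }
    }

// Kick on beats 0 and 2, snare on beat 4:
val hits = rhythm("x x o ", mapOf('x' to "kick", 'o' to "snare"))
```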
16:03.560 --> 16:10.520
There's one more concept that I want to talk about in Compose. It's called ConstraintLayout.

16:10.520 --> 16:17.160
So basically rather than laying out components in a grid we can anchor components to other components

16:17.160 --> 16:23.720
and sometimes when we make a complex layout that's more convenient. So we specify that a button

16:23.720 --> 16:30.280
should show at the top of the parent and then a text here should show below the button.

16:31.080 --> 16:37.240
And this is achieved by this constraint layout. I was quite pleased to see that actually

16:37.240 --> 16:41.560
that there's a nice analogy there where something like a constraint layout is quite useful.

16:43.720 --> 16:48.840
So if we're hooking up different audio building blocks. Let's say we have some kind of

16:48.840 --> 16:56.120
sine wave generator and we have a delay element that takes an input and applies a delay to it.

16:56.120 --> 17:03.000
We have a reverb that takes an input and applies a reverb to it. Then we can also use the same

17:03.000 --> 17:08.840
approach as with a constraint layout to hook these up to each other to basically say hey the

17:08.840 --> 17:14.840
sine wave generator I will call generator and then the delay accepts input from that generator.

17:14.840 --> 17:24.520
The reverb accepts input from the delay. I see that I'm running out of time here.

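A sketch of ConstraintLayout-style wiring applied to an audio graph, in plain Kotlin; the names and API shape here are invented for illustration:

```kotlin
// Nodes get names; other nodes reference those names as their input,
// much like ConstraintLayout anchors components to each other's references.
class AudioGraph {
    private val inputs = mutableMapOf<String, String>()

    fun node(name: String, inputFrom: String? = null) {
        if (inputFrom != null) inputs[name] = inputFrom
    }

    fun inputOf(name: String): String? = inputs[name]
}

val graph = AudioGraph().apply {
    node("generator")                      // sine wave generator, no input
    node("delay", inputFrom = "generator") // the delay accepts input from the generator
    node("reverb", inputFrom = "delay")    // the reverb accepts input from the delay
}
```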
17:25.240 --> 17:31.000
So I'll just quickly say: yeah, implementing this on multiple platforms is quite feasible because

17:31.000 --> 17:38.760
16-bit pulse code modulation is well supported. But I don't want to reinvent the wheel, so I'm

17:38.760 --> 17:43.080
also looking to hook this up to some libraries under the hood, something like SuperCollider, at

17:43.080 --> 17:49.480
least on desktop, but this requires careful abstractions and ensuring that the audio backends

17:49.480 --> 18:00.040
can support the features consistently. I will at this point I think because we've run out of time

18:01.160 --> 18:09.560
leave it here. Thank you so much for listening. Dacapo will be available there. Oops, I have

18:09.800 --> 18:19.080
stopped showing slides, so, about that. Yes, it will be available to you all there. If you've done

18:19.080 --> 18:22.440
something similar or have any expertise you want to share please reach out I will be super interested

18:22.440 --> 18:29.560
to discuss. You can reach me there at my email, and yeah, if there's time I would happily take some

18:29.560 --> 18:39.320
questions now. Thank you. Thank you, Merlin. And I'm also the next speaker, so I won't be

18:39.320 --> 18:44.360
able to run and give the microphone away so is there anybody willing to act as my avatar in the

18:44.360 --> 18:51.400
meanwhile, just to bring the microphone to whoever is asking any questions for Merlin? You can do it?

18:51.480 --> 19:03.320
Oh, thank you very much. Any questions? Any questions?

19:04.280 --> 19:18.200
That was Dacapo. Then we'll have our next presentation. Yeah. [inaudible]

19:26.200 --> 19:28.440
Thank you.

