WEBVTT

00:00.000 --> 00:14.800
Okay, so we're going to continue with the next talk.

00:14.800 --> 00:22.080
Jarek Potiuk is going to tell us a bit about his experience with using uv on Python

00:22.080 --> 00:28.080
monorepos, and he's working on the Apache Airflow project, which has a huge mono

00:28.080 --> 00:31.080
repo to maintain, and I think he's one of the maintainers.

00:31.080 --> 00:32.080
He's going to tell us about this.

00:32.080 --> 00:33.080
Give him a big hand.

00:33.080 --> 00:38.080
Thank you.

00:38.080 --> 00:41.080
Hello, everyone.

00:41.080 --> 00:44.080
I brought something today that you wanted badly.

00:44.080 --> 00:46.080
A little bit of airflow.

00:46.080 --> 00:48.080
Of course.

00:48.080 --> 00:51.080
I'm an Apache Airflow PMC member and committer.

00:51.080 --> 00:55.080
Let's actually look at who here knows what Airflow is.

00:55.080 --> 00:56.080
How many?

00:56.080 --> 00:57.080
Oh, yeah.

00:57.080 --> 00:58.080
Oh, this is almost everyone.

00:58.080 --> 01:00.080
So I don't have to explain anything.

01:00.080 --> 01:07.080
So I'm the number one contributor and committer, in terms of number of commits.

01:07.080 --> 01:10.080
I'm also a member of the Apache software foundation.

01:10.080 --> 01:17.080
Here is our fantastic new logo, and sure, you can visit our booth in the K building.

01:17.080 --> 01:23.080
If you want to ask about the foundation, I'm also a member of the ASF security committee.

01:23.080 --> 01:25.080
And I'm going to talk about airflow.

01:25.080 --> 01:29.080
Unfortunately, I will not bring the real airflow, sorry for that.

01:29.080 --> 01:35.080
Airflow is a data orchestrator, and I was singing for 30 years in a choir.

01:35.080 --> 01:42.080
So for me, the metaphor for explaining what Airflow is... and this choice was actually made by my wife.

01:42.080 --> 01:45.080
Because, you know, people ask me what Airflow is.

01:45.080 --> 01:49.080
So airflow is a kind of conductor.

01:49.080 --> 01:53.080
The conductor that directs data processing in data pipelines.

01:53.080 --> 01:57.080
It doesn't do any work itself — just like conductors don't do much work.

01:57.080 --> 02:00.080
They just wave the baton, and that's it.

02:00.080 --> 02:02.080
There is an old joke about that.

02:02.080 --> 02:06.080
So airflow is conducting the data processing in pipelines.

02:06.080 --> 02:12.080
It's one of the most — or actually, it's the most — popular orchestrator out there.

02:12.080 --> 02:18.080
We are holding first place, and pretty much any big name that you know uses it.

02:18.080 --> 02:21.080
You probably know all the logos in there.

02:21.080 --> 02:23.080
Everyone here uses airflow.

02:23.080 --> 02:26.080
We learned at the last Airflow Summit in Seattle last year:

02:26.080 --> 02:31.080
OpenAI uses Airflow for all the data pipelines training their models.

02:31.080 --> 02:33.080
And that was like okay.

02:33.080 --> 02:34.080
Okay.

02:34.080 --> 02:39.080
If OpenAI can do it, anyone can do it — and that was a very good learning for us.

02:39.080 --> 02:43.080
Very quickly, since everyone knows Airflow here:

02:44.080 --> 02:51.080
our pipelines are just tasks connected with some predicates, with some conditions.

02:51.080 --> 02:54.080
And you build your pipelines in Python.

02:54.080 --> 02:56.080
Hence the Python room.

02:56.080 --> 02:57.080
Of course.

02:57.080 --> 03:04.080
And we have this really beautiful UI — especially Airflow 3 brought a new UI written in React,

03:04.080 --> 03:08.080
Which is optimized right now to actually manage those pipelines.

03:08.080 --> 03:13.080
So the whole idea is: you can have huge pipelines across the whole organization,

03:13.080 --> 03:17.080
all the departments — thousands of people developing the pipelines.

03:17.080 --> 03:23.080
But then you can have like one or two people just using the UI to manage those pipelines,

03:23.080 --> 03:26.080
Because it actually helps them to understand what's going on.

03:26.080 --> 03:29.080
So that's basically the value of Airflow.

03:29.080 --> 03:36.080
So I've always been working as the CI/CD and dev environment person in Airflow, for as long as I remember.

03:37.080 --> 03:43.080
And for years we had a very complex setup of our development environment.

03:43.080 --> 03:48.080
Especially considering that for Airflow 2, which was like five years ago,

03:48.080 --> 03:56.080
we decided to split Airflow into — back then — maybe 20 or 30 separate distributions.

03:56.080 --> 03:58.080
So separate PyPI packages.

03:58.080 --> 04:05.080
pip install something — you could install Airflow and 20 or 30 modules or distributions.

04:05.080 --> 04:10.080
Right now it's around 100 distributions we have.

04:10.080 --> 04:16.080
And we've always been struggling with how to make the development environment work for this, because it was like:

04:16.080 --> 04:22.080
okay, somebody develops just core Airflow, somebody develops just particular providers, as we call them —

04:22.080 --> 04:26.080
so one of those 20 — or, right now, 100 — things.

04:26.080 --> 04:33.080
And the whole setup we had wasn't very quick, wasn't very UX friendly.

04:34.080 --> 04:37.080
Well, I developed it, so I can say that.

04:37.080 --> 04:39.080
I'm an engineer.

04:39.080 --> 04:45.080
So we developed it initially in a very custom way,

04:45.080 --> 04:50.080
Pretty much because there was no support from the packaging ecosystem to do it differently.

04:50.080 --> 04:53.080
So we had to figure out our own way.

04:53.080 --> 04:55.080
A very quick heads-up.

04:55.080 --> 05:02.080
So this talk is based on a series of blog posts — four of them, actually.

05:02.080 --> 05:05.080
Oh fantastic almost nothing.

05:05.080 --> 05:06.080
Yeah.

05:06.080 --> 05:08.080
Oh now better.

05:08.080 --> 05:16.080
So I wrote a series of four blog posts, and this is the first one, which describes this — the talk is mostly about that.

05:16.080 --> 05:18.080
You can find more details there.

05:18.080 --> 05:23.080
So the first part of this blog post and this presentation is about some pains we had,

05:23.080 --> 05:31.080
an explanation of how standards are helping us now to overcome those pains,

05:31.080 --> 05:37.080
then a few words about something that helped us even more, which is prek.

05:37.080 --> 05:42.080
If you haven't heard about prek, you will hear about it in this presentation.

05:42.080 --> 05:47.080
And then the last part is a bit of the innovation we are still doing in this space.

05:47.080 --> 05:53.080
So we are still doing a little bit more than the standards allow us, and we developed something called shared libs,

05:53.080 --> 05:58.080
which we hope will maybe eventually become something that others can use as well.

05:58.080 --> 06:06.080
So when you have this kind of big modular application, you have a dilemma.

06:06.080 --> 06:12.080
Either you have all separate repositories, which in our case would mean 120 repositories to manage,

06:12.080 --> 06:16.080
Or you have the giant monolith repository where you keep everything.

06:16.080 --> 06:26.080
And I have always been a proponent of the monolith, and did everything to make it happen, but it was terribly, terribly difficult.

06:26.080 --> 06:31.080
We have like a decade of development of Airflow, more than 700 dependencies,

06:31.080 --> 06:38.080
and we have to build everything in a way that makes it easy for many, many contributors to contribute.

06:38.080 --> 06:41.080
Airflow is the most popular ASF —

06:41.080 --> 06:51.080
Apache Software Foundation — project in terms of number of contributors: we have more than 3,600 contributors, which is a lot.

06:52.080 --> 06:58.080
So the idea is: how can we make contributing to such a huge modular code base really frictionless?

06:58.080 --> 07:03.080
And just to give you a little bit of an overview of what kind of difficulties we are dealing with:

07:03.080 --> 07:05.080
So this is just one week.

07:05.080 --> 07:08.080
163 merged pull requests in this week,

07:08.080 --> 07:11.080
70 authors, 130 commits.

07:11.080 --> 07:18.080
So basically every day of every week we have more than 20 PRs merged — and that's merged,

07:18.080 --> 07:19.080
not just reviewed.

07:19.080 --> 07:22.080
So that's a lot, a lot of PRs.

07:22.080 --> 07:24.080
You can see actually me here at the very beginning.

07:24.080 --> 07:27.080
I am usually at the beginning.

07:27.080 --> 07:30.080
The second one is a bot which I wrote,

07:30.080 --> 07:33.080
so I count twice.

07:33.080 --> 07:38.080
And the third one is Jens, who people actually mistake for me because he's bald,

07:38.080 --> 07:39.080
similarly to me.

07:39.080 --> 07:41.080
But he's actually almost two meters tall,

07:41.080 --> 07:44.080
so it's very easy not to mistake us.

07:44.080 --> 07:47.080
And the fourth one is also a bot I wrote.

07:47.080 --> 07:48.080
Okay.

07:48.080 --> 07:53.080
We have a lot of people contributing — I'm just joking.

07:53.080 --> 07:57.080
So, the cost of custom code: we managed 100 distributions,

07:57.080 --> 08:01.080
and that was a serious pain point, because it was completely custom.

08:01.080 --> 08:07.080
You had to use the tooling that we built to manage the dependencies and build the distributions.

08:07.080 --> 08:10.080
We basically copied the code and put it somewhere else;

08:10.080 --> 08:17.080
we generated pyproject.toml and setup.py dynamically — it was completely custom.

08:17.080 --> 08:23.080
And also there was no proper isolation between those distributions,

08:23.080 --> 08:36.080
because they all lived in a single source tree — a single directory structure — which means that at any point in time anyone could add an import to any other part of the repository and it would work, because it was the same

08:36.080 --> 08:38.080
repository — even if it shouldn't.

08:38.080 --> 08:44.080
So we had to add some artificial checking of whether we are using something from this directory or that directory.

08:44.080 --> 08:46.080
We used pre-commit for that;

08:46.080 --> 08:49.080
I'll talk about that a little later.

08:49.080 --> 08:53.080
And the maintenance burden was really high.

08:53.080 --> 09:01.080
So fortunately the PyPA team — the Python Packaging Authority —

09:01.080 --> 09:02.080
Yes, that's right.

09:02.080 --> 09:03.080
That's right, Hugo.

09:03.080 --> 09:05.080
Yes, the Python Packaging Authority.

09:05.080 --> 09:11.080
There is one of the core release managers of Python here — Hugo, manager of the last release.

09:11.080 --> 09:22.080
So there is a whole Python Packaging Authority team that was relentlessly working to improve the situation.

09:22.080 --> 09:29.080
And I was fighting with them at the very beginning, because I had other ideas, and they fought me — that was a little bit difficult.

09:29.080 --> 09:31.080
But then I understood why they are doing that.

09:31.080 --> 09:41.080
And then finally they did a lot of things that actually made our life easier.

09:41.080 --> 09:51.080
So the PyPA was focused on standards-driven excellence of the ecosystem, and between, I would say, five years ago and now, it's incomparable.

09:51.080 --> 09:58.080
We based our conversion of Airflow on lots of the latest standards.

09:58.080 --> 10:11.080
We read them in detail and applied them — we made sure that we are following them meticulously, because we knew that we have a lot of, you know, baggage, and some things were implicit and not really following those standards.

10:11.080 --> 10:16.080
So we just made sure we are following all the standards we can, and that helps a lot.

10:16.080 --> 10:24.080
However, there was one thing missing in the standards — or not yet implemented, because nobody had focused on it yet.

10:24.080 --> 10:32.080
So the first part: the standards helped us to convert those distributions we have into the standard pyproject.

10:32.080 --> 10:40.080
toml — no setup.py any more, all the dependencies are nicely defined and all that stuff — it's kind of super modern right now.

10:40.080 --> 10:52.080
That was more than two years ago, but then we needed something to bring the distributions together, so that we can develop on each distribution separately, but also

10:52.080 --> 10:59.080
develop on them all together, because we want to be able to install all those distributions — all hundred of them — at the same time in the same virtual environment.

10:59.080 --> 11:10.080
So it was a little bit have-your-cake-and-eat-it-too: we want to have separate distributions, but we also want to be able to run them together with other distributions and, for example, be able to find out

11:10.080 --> 11:24.080
whether we have a set of dependencies — we have more than 700 of them — which allows us to install everything together without any conflicts, which is kind of difficult, as you probably know.

11:24.080 --> 11:26.080
So.

11:26.080 --> 11:30.080
Meet uv. Who didn't hear about uv?

11:30.080 --> 11:35.080
Okay, I thought so — maybe two hands here. So yeah, we adopted uv.

11:35.080 --> 11:54.080
We struggled a little bit with that, because uv is owned by Astral. So it's an open source project whose governance is not like PyPA's — not like Hatch or pip or whatever — but we decided: yeah, it's okay for us, as an Apache Software Foundation project, to depend on it for development.

11:54.080 --> 12:16.080
So we adopted it and we love it. And the thing is, we actually worked with Charlie Marsh and his team at Astral, because the workspace feature was largely modeled on the needs of Airflow — Airflow is one of the biggest projects in open source, very well known, and one of those that definitely had the need for workspaces.

12:16.080 --> 12:22.080
So the initial implementation even had a provider example in the documentation which was taken from Airflow.

12:22.080 --> 12:26.080
It's gone now — the documentation structure is different.

12:26.080 --> 12:50.080
But generally, what it is: it allows you to define a workspace, where you declare which of the folders in your monorepo — which distributions, each of them a separate kind of small distribution — are part of your workspace, and it allows you to do various things within the workspace.

12:50.080 --> 12:56.080
One of them: it allows you to install the whole workspace together, or run the full workspace together.

12:56.080 --> 13:03.080
But another very interesting thing: a distribution in the workspace can refer to another distribution in the workspace

13:03.080 --> 13:09.080
as a regular dependency requirement in the standard pyproject.toml.

13:09.080 --> 13:19.080
But then it will not install this dependency from PyPI — it will use it from the workspace, from the sources in the same monorepo, which is exactly what we needed.
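The setup just described can be sketched roughly like this (distribution and path names here are illustrative, not Airflow's actual layout):

```toml
# Root pyproject.toml of the monorepo: declares the workspace members.
[tool.uv.workspace]
members = ["airflow-core", "providers/*"]

# pyproject.toml of one member (a hypothetical provider): it depends on
# another member by its normal distribution name...
[project]
name = "example-provider"
version = "0.1.0"
dependencies = ["airflow-core"]

# ...and [tool.uv.sources] tells uv to resolve that dependency from the
# workspace sources in the same monorepo instead of downloading it from PyPI.
[tool.uv.sources]
airflow-core = { workspace = true }
```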

13:20.080 --> 13:27.080
So this is our directory structure — a small fragment of our directory structure.

13:27.080 --> 13:33.080
We have, for example, airflow-core and airflow-ctl — those are two distributions we have.

13:33.080 --> 13:38.080
Each of them has its own src and tests folder, and each of them has its own pyproject.toml.

13:38.080 --> 13:40.080
This is all fully standard.

13:40.080 --> 13:48.080
If you enter the airflow-core directory you just see that subset of the tree, and that's a fully flat, regular Python project.

13:48.080 --> 13:56.080
But this is all part of the monorepo, and you see all the blue ones are the same — we have lots and lots and lots of them.

13:56.080 --> 14:05.080
So we have 122 or 123 pyproject.toml files right now in our monorepo.

14:05.080 --> 14:09.080
So how does it work? It's very simple.

14:09.080 --> 14:16.080
This is the top root of the Apache Airflow repository,

14:16.080 --> 14:21.080
and you run uv sync, refreshing all packages — there are a number of options.

14:21.080 --> 14:27.080
And as you see, it locked and resolved all 913 packages we have there — a lot.

14:27.080 --> 14:33.080
And you might say it's long, like 10 seconds, but I'll tell you the secret:

14:33.080 --> 14:37.080
this was run on a Eurostar train from London to Brussels with a very weak connection.

14:37.080 --> 14:43.080
That's why it took so long — normally it is much, much faster.

14:43.080 --> 14:51.080
And the interesting thing that happens — and this is really a fantastic feature:

14:51.080 --> 14:57.080
if you go to the subdirectory airflow-core — this is one of our distributions — and run uv sync there,

14:57.080 --> 15:04.080
it actually uses the same virtual environment which is at the root of your workspace, which is an interesting one.

15:04.080 --> 15:07.080
And you see it's not 900.

15:07.080 --> 15:13.080
It now uninstalls 512 packages from those 900 installed before.

15:13.080 --> 15:24.080
What happens right now when you run uv sync: it tells you, okay, now in my virtual environment I want only the things that this particular distribution needs, and nothing else.

15:24.080 --> 15:33.080
So I don't have any other dependencies right now that this distribution doesn't need — they came from the other distributions we have in our repository,

15:33.080 --> 15:37.080
but airflow-core doesn't depend on them, so it doesn't need them.

15:37.080 --> 15:47.080
And this means that your tests, your code completion, all of the things that you have —

15:47.080 --> 15:58.080
they just see those packages that this particular distribution needs. So it's exactly as if you had a separate distribution in a separate repository and just didn't know anything about the other things,

15:58.080 --> 16:09.080
which is quite cool. And it's super fast with uv, because uv is, you know, caching a lot and able to switch virtual environments immediately and recreate them.

16:09.080 --> 16:17.080
And another thing: this is one of the providers, the Amazon provider — we have 90 of them right now.

16:17.080 --> 16:26.080
So you see what happens when you do uv sync in the provider: it uninstalls again many packages, and you can see which ones it uninstalls,

16:26.080 --> 16:33.080
because those are all the things that are not needed by the Amazon provider — you only get what you need.

16:33.080 --> 16:41.080
And that allows, for example, one thing: when you are in this directory, you do uv run pytest.

16:41.080 --> 16:49.080
It will actually sync and run pytest with only the dependencies that are declared for this particular distribution, and nothing else.

16:49.080 --> 16:58.080
Which means that if you do it at the top level, you have everything; if you do it at the directory level, you have only the part that this particular provider needs.
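As a rough sketch of the terminal session just described (the flags are uv's; the directory names follow the layout described in the talk, so treat the exact invocations as illustrative):

```shell
# At the workspace root: lock and install every workspace member and
# all of their dependencies into the shared root .venv.
cd airflow
uv sync --all-packages

# Inside one distribution: uv reuses the same root .venv but removes
# everything this member does not declare as a dependency.
cd airflow-core
uv sync

# Run the tests of just this distribution, against only its own
# declared dependencies.
uv run pytest
```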

16:58.080 --> 17:04.080
And very interestingly, for example, if we have a dependency between the Amazon and Google providers — which we do —

17:04.080 --> 17:09.080
it also installs the dependencies that the other distribution in the same source tree needs.

17:09.080 --> 17:14.080
So it automatically resolves all the dependencies across our distributions as well,

17:14.080 --> 17:25.080
which is super cool, because then we can run all the tests individually — which we do in our CI — just by going into the directory and running uv run.

17:25.080 --> 17:39.080
And this means that if somebody accidentally imports from a distribution — from airflow import whatever — which was not really declared as a dependency of this particular provider that we are testing,

17:39.080 --> 17:51.080
it will just not run, because it will not be there — it will not be installed. And it simply installs also all the source distributions that are locally available.

17:51.080 --> 17:56.080
It's very nice — surprisingly, it just works.

17:56.080 --> 18:04.080
It even works with IDE setups. The IDEs do not yet have full support for this kind of monorepo,

18:04.080 --> 18:16.080
but we developed simple scripts — setup-idea, setup-vscode — that just walk through all our distributions and create the right .iml file, or whatever it is for VS Code, I can't remember.

18:16.080 --> 18:27.080
And they just get the project definition, which has everything you need, from all those distributions. So we can also work very nicely with IDEs this way.

18:27.080 --> 18:31.080
But then, each distribution is isolated:

18:31.080 --> 18:39.080
you have to explicitly declare that one distribution depends on the other in order to use it — in order to import from that particular distribution.

18:39.080 --> 18:49.080
Which is super cool and super critical for us, because we are in the middle of splitting Airflow even more and isolating things, and we just got it out of the box thanks to that.

18:49.080 --> 18:55.080
And it's super flexible, because you can very easily switch between distributions and sync things as you want.

18:55.080 --> 19:00.080
So we would love to have workspaces as an industry standard — and Hatch has them already.

19:00.080 --> 19:08.080
I even worked to that effect — I am a co-author of a PEP which has been long, long in a draft state.

19:08.080 --> 19:19.080
I don't know — long in a not-yet-pronounced state. But Ofek knows of it, and he even implemented it in Hatch, which he owns.

19:19.080 --> 19:27.080
Workspaces are implemented there right now as well, so we can use the Hatch workspace implementation, which is very, very similar — modeled after the uv one.

19:27.080 --> 19:40.080
There is no standard yet for that. However, one of the problems we had was a pre-commit bottleneck — who uses pre-commit here? A lot of people, yeah.

19:40.080 --> 19:54.080
So we also used it — it's nice — but a single pre-commit config at the top of the repository doesn't work well if you have 90 distributions, because you want to have pre-commit in each of those sub-distributions.

19:54.080 --> 20:08.080
And there is this new kid on the block: prek. I highly recommend it — it's a one-to-one replacement written in Rust, many times faster, it uses uv to install everything, and working with Joe is fantastic.

20:08.080 --> 20:28.080
Compared to working with the pre-commit author, who didn't want to accept any of our proposals for years, Joe is very nice and implemented monorepo support for us, and it works beautifully. We now have many pre-commit config files.

20:28.080 --> 20:39.080
So, prek: Rust-based, a one-to-one replacement, it's monorepo aware, and it has a number of very, very cool features. For example, it supports inline script metadata.

20:39.080 --> 20:54.080
So you can now specify inline script metadata in your scripts, you don't have to declare the dependencies in your YAML file, and it will just create an isolated virtual environment for each of them using uv — which is super cool; we used it immediately.
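As an illustration, a minimal hook script with a PEP 723 inline-metadata header might look like this (the hook itself — a JSON validity check — is hypothetical, not one of Airflow's):

```python
#!/usr/bin/env python3
# Hypothetical hook script. Runners such as `uv run` (and prek, per the
# talk) read the `# /// script` block below and build an isolated
# environment with the declared dependencies, so nothing needs to be
# listed in the .pre-commit-config.yaml.
# /// script
# requires-python = ">=3.9"
# dependencies = []  # third-party deps would be listed here, e.g. "pyyaml"
# ///
import json
import sys


def invalid_json_files(paths):
    """Return the subset of paths that fail to parse as JSON."""
    bad = []
    for path in paths:
        try:
            with open(path, encoding="utf-8") as f:
                json.load(f)
        except (OSError, ValueError):
            bad.append(path)
    return bad


if __name__ == "__main__":
    failed = invalid_json_files(sys.argv[1:])
    for path in failed:
        print(f"invalid JSON: {path}")
    # a real hook would exit non-zero here when `failed` is non-empty
```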

20:54.080 --> 21:09.080
Yeah, so there is monorepo support, integration with uv, all these things which are really cool. So we now have more than one pre-commit config in our repo, and we will have more, because we haven't completed all the migration yet.

21:09.080 --> 21:19.080
How it works: for example, you can go to one of the provider distributions and run prek. It will only run those hooks that are for this particular distribution, as you would expect. That's beautiful.

21:19.080 --> 21:24.080
And if you run it at the top level, it runs all of them, of course.

21:24.080 --> 21:35.080
So, one thing that we didn't solve before early this year: DRY. So, we have all those distributions.

21:35.080 --> 21:53.080
For example, we are now splitting Airflow into, I don't know, maybe 10 more distributions — separate for UI, separate for core, separate for scheduler, separate for triggerer, separate for data access — for various reasons, including security.

21:53.080 --> 22:02.080
We want them separated, we want to have them really nicely isolated, but lots of them are using the same shared code that we want to reuse across them.

22:02.080 --> 22:17.080
And the traditional approach is: you build another, common kind of distribution — shared logging, whatever — and you depend on it in both.

22:17.080 --> 22:30.080
Then you have this shared code and you just install that dependency. But that has a trap — DRY has one trap that not everyone is aware of, namely coupling.

22:30.080 --> 22:45.080
If you make stuff DRY, then you also couple together the things that are using that DRY code, which means that if you install, for example, one version of your logging library, all its users have to use the same version.

22:45.080 --> 22:59.080
Python doesn't have what npm has — the capability of using different versions of the same library at the same time. So you can only install one version of a library, which has both good and bad sides.

22:59.080 --> 23:13.080
And I think it's a good choice, by the way — the npm way is not a good choice, not when you have to download half of the internet when you run a single command.

23:14.080 --> 23:23.080
So the problem is that, yeah, the interpreter loads exactly one version of each library, and then you have to run testing for all the different version combinations.

23:23.080 --> 23:27.080
It very quickly gets out of control if you want to use it this way.

23:27.080 --> 23:40.080
So it isn't really maintainable when you, for example, want to use the same logging or config library in several distributions and you want to be able to freely change the versions of those distributions that you install at the same time.

23:40.080 --> 23:58.080
This is the problem we had. So we took inspiration from static libraries from C — you probably know static libraries: once you link the library, the code is embedded into that particular binary; it's not loaded from a shared library.

23:58.080 --> 24:21.080
And Python has something similar: for years, Python projects — including pip and the things it manages — used vendoring to vendor either whole libraries or pieces of libraries. If you didn't want to have a library as a dependency, you just took the code and changed the module from which it was imported.

24:21.080 --> 24:41.080
You made all the modifications so that the imports work properly even inside, and you put it as part of your repository — which we didn't want to do manually. So we chose another solution, and it's a very interesting one, and I hope sometime in the future we will turn it into something reusable.

24:41.080 --> 24:57.080
Our shared code is another distribution — it's inside our repository the same way as any other — but then it's symbolically linked as a particular Python package into the package that is using it.

24:57.080 --> 25:15.080
So we are not copying the stuff, we just symbolically link it in the source directory. Which is very interesting, because we had to do a small modification in Hatch to replace the links with a physical copy during packaging, but that was a built-in feature that we just had to configure.

25:15.080 --> 25:42.080
We had to implement some prek hooks — I'll get to that in a moment — to keep consistency in how we refer to those modules. And it works only in certain environments, because, you know, Windows doesn't natively have symbolic links, so you have to use WSL2 — but that was not a limitation for Airflow, because WSL2 was always required.
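A minimal sketch of the symlinking idea (illustrative paths and names; this is not Airflow's actual tooling):

```python
# Sketch of how a shared distribution's package can be symlinked into a
# consumer's source tree, as described in the talk. All paths and names
# here are hypothetical.
import os


def link_shared_package(shared_pkg: str, consumer_src: str, link_name: str) -> str:
    """Symlink the shared package directory into the consumer's source tree.

    A relative link target is used so the monorepo stays relocatable.
    Returns the path of the created (or pre-existing) symlink.
    """
    link_path = os.path.join(consumer_src, link_name)
    if not os.path.islink(link_path):
        target = os.path.relpath(shared_pkg, start=consumer_src)
        os.symlink(target, link_path, target_is_directory=True)
    return link_path
```

The consumer then imports the shared code under its own package (e.g. `from airflow_core._shared_logging import ...` in this sketch), which is what lets two distributions carry different vintages of the same shared code side by side.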

25:42.080 --> 25:52.080
And it just works — as we implemented it, it just works in the IDEs without any modification, which was a bit surprising for us.

25:53.080 --> 26:08.080
So what it gives us: uv sync and the prek hooks automatically give us the kind of guardrails to check whether we have everything synchronized. For example, we have a shared library —

26:09.080 --> 26:29.080
our logging library, which uses structlog as a dependency. This library is symlinked into a folder inside the distribution that is using it, but the dependencies of this shared library are also synced to the distribution that uses it.

26:29.080 --> 26:42.080
I'll show that in a moment. And we have some prek hooks which verify that we are properly symbolically linking and referring to those shared distributions.

26:43.080 --> 26:50.080
So how it works: we have a configuration — like the airflow shared configuration.

26:52.080 --> 27:03.080
So that's the definition of the shared configuration directory in the shared folder — or the listener one — those are the three example shared libraries that we use in our code.

27:03.080 --> 27:19.080
And how we use them: those are just symbolic links that refer to those shared libraries, and they are in two places. So there are two users of those libraries: one of them is linked in airflow_shared, and the second is in the airflow SDK's shared package.

27:19.080 --> 27:33.080
The same library is linked in a different folder, in a different package, which means that when those two distributions are installed at the same time, each is using the library version that was valid at the time of preparing that package, because it was used from the sources.

27:33.080 --> 27:48.080
It was imported from one folder, and the other can use the same library but maybe in a different version — maybe a version from a week or two weeks later — but it is imported under a different package, and they can both be used together.

27:48.080 --> 27:53.080
Yeah, so there is some configuration that is needed in the

27:54.080 --> 28:08.080
pyproject.toml. Without going into details, all of this is kind of automatically synced by our prek hooks, so we make sure that all the distributions are consistent and in sync, and the future is modular. So we found out that it really works very well for us.

28:09.080 --> 28:12.080
And I hope in the future we can make a standard out of that. Thank you very much.

28:12.080 --> 28:22.080
Thank you very much for the nice talk, and unfortunately we are out of time for questions, so...

28:23.080 --> 28:24.080
I'll be around.

