A list of puns related to "Koopmans' theorem"
I am having some trouble proving Koopmans' theorem. The question is in this pdf; I have already solved the functional.
Cheers!
Hello everyone!
I have a question I was hoping the community could help me out with. My own research is almost entirely about kernel functions. I did my PhD in pure mathematics, where I studied densely defined operators over a variety of classical kernel spaces and even constructed some of my own (hello, polylogarithmic Hardy space!). Since graduating, I have been working in approximation theory and numerical analysis with engineers, and recently came back to operator theory through the study of Koopman operators and Dynamic Mode Decompositions.
Reading textbooks by big names in the field, I notice that Steve Brunton, for instance, makes almost no mention of kernels in his textbook, Data-Driven Science and Engineering, and in my conversations with engineers over the years there might be a nod to the Gaussian RBF, but then it's all about deep learning.
I have always been able to find new and interesting perspectives on kernel functions for learning theory, and a lot of these innovations are really just twists on ideas from 40 or 50 years ago (thanks to the great Wahba!). I feel there is still a lot more life in the subject. However, as far as I can tell, most of my colleagues are of the opinion that kernels were concocted to do some esoteric classification with SVMs and to compute inner products in feature spaces, and are otherwise unaware that kernel spaces were central to things like Shannon's sampling theorem and many other classical topics.
Have we abandoned kernel functions for deep learning? Is there a good reason why people don't use kernels that I'm just missing? I'd be interested in hearing everyone's perspective.
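For readers who only know kernels from the SVM story, the "inner products in feature spaces" idea the post mentions is easy to make concrete with kernel ridge regression. This is a minimal sketch of my own, assuming only NumPy; the function names (`rbf_kernel`, `kernel_ridge_fit`, etc.) are illustrative, not from any particular library:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix: K[i, j] = exp(-gamma * ||x_i - y_j||^2).
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def kernel_ridge_fit(X, y, gamma=1.0, lam=1e-3):
    # Solve (K + lam I) alpha = y; the regressor is f(x) = sum_i alpha_i k(x, x_i),
    # an element of the RKHS generated by the kernel.
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(X_train, alpha, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Fit a 1-D nonlinear function from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(60)
alpha = kernel_ridge_fit(X, y, gamma=0.5, lam=1e-2)
X_test = np.linspace(-3, 3, 7)[:, None]
pred = kernel_ridge_predict(X, alpha, X_test, gamma=0.5)
```

The point of the example is that the nonlinear fit comes entirely from linear algebra in the kernel's RKHS; no feature map is ever written down explicitly.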
I don't want to step on anybody's toes here, but the number of non-dad jokes in this subreddit really annoys me. First of all, dad jokes CAN be NSFW; it clearly says so in the sub rules. Secondly, a joke isn't automatically a dad joke just because it came from a conversation between you and your child. Most importantly, the jokes that your CHILDREN tell YOU are not dad jokes. The point of a dad joke is that it's so cheesy only a dad who's trying to be funny would make it. That's it. They are stupid plays on words, lame puns and so on. There has to be a clever pun or wordplay for it to be considered a dad joke.
Again, to all the fellow dads, I apologise if I'm sounding too harsh. But I just needed to get it off my chest.
Hello everyone!
It's been a bit of a while since I've really made a solid post about this, but I have been continuing my exploration of Dynamic Mode Decompositions. If you don't know what that is, then it is a machine learning technique for studying unknown dynamical systems from collections of their trajectories. The method intertwines operator theory, data science, and dynamical systems theory, and also intersects a good deal with control theory. It is a great little place to explore new operator theoretic methods from a pure mathematical standpoint, and to use operator theory tools to answer questions in machine learning.
I have an introductory lecture here, for those that are interested: https://youtu.be/_qjSprLvGS0
My colleagues and I have been picking apart the field for the past two or three years, and over the last year or so we have made some significant strides. There has been a lot of work surrounding DMD that uses Koopman operators, which play very well with ergodic theory. That means if you restrict yourself to L^1 or L^2, there is some hope of recovering at least one eigenfunction of the Koopman operator through the Birkhoff and von Neumann ergodic theorems. This is largely the approach that Igor Mezic and his colleagues have taken in the exploration of "Koopmanism."
However, Koopman operators truly correspond to discrete-time systems, so if you are going to study a continuous-time system using those operators, you need a system that is discretizable. This means you need a system that is forward invariant or forward complete (the former is a statement about sets, the latter about dynamics). One way to verify forward completeness is a global Lipschitz condition, which means your system would need to be bounded globally by a linear system.
We just had a paper accepted to the Journal of Nonlinear Science (arXiv: https://arxiv.org/abs/1910.03977), which uses occupation kernels and Liouville operators (a generalization of Koopman generators) to access the continuous-time dynamics directly, without worrying about discretizations. This means we can study a wider class of dynamics with our methods, and can leverage observed data from systems that admit finite escape times, such as x' = 1 + x^2 (whose solutions are tangent functions). I give an outline of the method here: https://youtu.be/xfZG0m
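For context, the baseline that all of this builds on is plain "exact DMD": fit a linear map between snapshot matrices and read off its spectrum. The sketch below is my own NumPy illustration of that standard recipe (it is not the occupation-kernel method from the paper):

```python
import numpy as np

def dmd(X, Xp, r):
    # Exact DMD: fit a linear map A with Xp ~ A X via a rank-r SVD of X.
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r, :]
    # Project A = Xp V S^{-1} U* onto the leading r POD modes.
    A_tilde = U.conj().T @ Xp @ Vh.conj().T / s
    eigvals, W = np.linalg.eig(A_tilde)
    modes = Xp @ Vh.conj().T / s @ W   # exact DMD modes
    return eigvals, modes

# Sanity check on a known linear system x_{k+1} = A_true x_k:
# DMD should recover the eigenvalues 0.9 and 0.8 of A_true.
A_true = np.array([[0.9, 0.2],
                   [0.0, 0.8]])
x = np.array([1.0, 1.0])
snaps = [x]
for _ in range(20):
    x = A_true @ x
    snaps.append(x)
S = np.array(snaps).T              # snapshots as columns, shape (2, 21)
X, Xp = S[:, :-1], S[:, 1:]        # time-shifted snapshot pair
eigvals, modes = dmd(X, Xp, r=2)
```

On genuinely nonlinear data the same computation is interpreted as approximating the spectrum of a Koopman (or, in the continuous-time setting of the post, Liouville) operator restricted to the observables at hand, which is where the convergence questions discussed above come in.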
Do your worst!
I'm surprised it hasn't decade.
For context I'm a Refuse Driver (Garbage man) & today I was on food waste. After I'd tipped I was checking the wagon for any defects when I spotted a lone pea balanced on the lifts.
I said "hey look, an escaPEA"
No one near me but it didn't half make me laugh for a good hour or so!
Edit: I can't believe how much this has blown up. Thank you everyone, I've had a blast reading through the replies!
It really does, I swear!
Hello everyone!
If you have been following my posts, then you will know that I've been posting the lectures for the classes I've been teaching during the pandemic up on YouTube at http://www.thatmaththing.com/. This includes my class on Data Driven Methods for Dynamical Systems.
One of the big thrusts of this class has been to demonstrate my group's perspective on problems like the SINDy algorithm and Dynamic Mode Decompositions. For the former, we showed how integrals can be used not only to gain robustness against sensor noise, but also to define a data-driven inner product space on collections of nonlinear dynamics through their Liouville operators over particular RKHSs.
For Dynamic Mode Decompositions, we have been taking this a lot further. Our approach has been completely independent of most of the rest of the literature, where we lean on function theoretic operator theory, Liouville operators, and reproducing kernel Hilbert spaces in lieu of Ergodic Theory and Koopman operators. The tools developed using this perspective are more flexible and cover a broader range of nonlinear dynamical systems. Moreover, it gives us proper convergence theories for DMD that aren't available through the Koopman formalism.
I'm working on a few videos diving into the convergence theory right now, and I am trying to hash out the story here. Maybe you guys can help me frame this?
To make the advantages of my group's perspective clearer, let's talk about what we need from our dynamics in the Koopman/ergodic-theory perspective. Koopman operators are composition operators defined with respect to the discretization of a continuous-time dynamical system. They are posed over function spaces, whose members are called observables. If we have dynamics that are not forward invariant, such as \dot x = 1 + x^2, which gives tangent functions with finite escape times (vertical asymptotes), then the resultant discretizations of the dynamics will have holes in them. This is problematic, since a composition operator whose symbol has holes in it is not well defined on the whole function space.
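The finite-escape-time behaviour of \dot x = 1 + x^2 is easy to see numerically. This is a small sketch of my own (plain forward Euler, standard library only; the function name is illustrative): the exact solution through x(0) = x0 is x(t) = tan(t + atan(x0)), which blows up at t* = pi/2 - atan(x0), so no time discretization of the flow can be continued past t*.

```python
import math

def escape_time_euler(x0=0.0, dt=1e-5, blowup=1e6):
    # Integrate x' = 1 + x^2 with forward Euler until |x| exceeds `blowup`,
    # and return the time at which that happens. For x0 = 0 the true
    # solution is tan(t), with a vertical asymptote at t = pi/2.
    x, t = x0, 0.0
    while abs(x) < blowup:
        x += dt * (1.0 + x * x)
        t += dt
    return t

t_star = escape_time_euler()
# t_star should sit just past the exact escape time pi/2 ~ 1.5708,
# since forward Euler slightly lags the true (convex, blowing-up) solution.
```

This is exactly the obstruction in the post: the trajectory simply does not exist beyond t*, so a discrete-time flow map (and hence a Koopman operator built from it) has a hole there.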
They're on standbi
Pilot on me!!
Nothing, he was gladiator.
Dad jokes are supposed to be jokes you can tell a kid who will understand them and find them funny.
This sub is mostly just NSFW puns now.
If it needs a NSFW tag it's not a dad joke. There should just be a NSFW puns subreddit for that.
Edit: I'm not replying any longer and I've turned off notifications, but to all those who say "no one cares": there sure are a lot of you arguing about it. Maybe I'm wrong, but you don't need to be rude about it. If you really don't care, don't comment.
When I got home, they were still there.
What did 0 say to 8?
"Nice belt!"
So what did 3 say to 8?
"Hey, you two, stop making out."
I won't be doing that today!
This morning, my 4 year old daughter.
Daughter: I'm hungry
Me: nerves building, smile widening
Me: Hi hungry, I'm dad.
She had no idea what was going on but I finally did it.
Thank you all for listening.
You take away their little brooms
There hasn't been a post all year!
Why
It's pronounced "Noel."
After all, his first name is No-vac.
What, then, is Chinese rap?
Edit:
Notable mentions from the comments:
Spanish/Swedish/Swiss/Serbian hits
French/Finnish art
Country/Canadian rap
Chinese/Country/Canadian rock
Turkish/Tunisian/Taiwanese rap
There hasn't been a single post this year!
(Happy 2022 from New Zealand)
Nothing, it just waved
Him: I can explain everything!
(It's his best joke yet I think)
Bob
So that I could frequently say, "I am going to walk 5 miles now."
Edit: My most popular post on Reddit! Thank you for the awards.
Just to clarify, 12345678
Me, grabbing a soda from my (what I thought was) half-full 12-pack...
Notices there are only 2;
Me: "Aw man... This is a damn bird box!" Her: "What the hell does that mean?!" Me: (pulls both cans out and shows them to her) "It's only got Toucans."
I'm not ashamed to admit the look on her face was glorious.
I was just sitting there doing nothing.