A list of puns related to "Pushforward"
Hello guys,
For a long time I skipped over the pushforward, but I just read about it here: https://juliadiff.org/ChainRulesCore.jl/dev/ and the description was very clear. The terms there are frule and rrule. I tried searching online, but I couldn't find out why we don't use the pushforward for gradient computation.
Why do we use the pullback instead? Is it faster? Does anyone know the downsides of using the pushforward to compute gradients?
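A minimal sketch of the usual answer (my own toy example, not how ChainRulesCore.jl implements frule/rrule; the names `Dual` and `grad_forward` are mine): the pushforward (forward mode, a JVP) propagates one input direction per pass, so a function f : R^n -> R needs n forward passes for a full gradient, while a single pullback (reverse mode, a VJP) recovers it in one pass.

```python
class Dual:
    """A number a + b*eps with eps^2 = 0; b carries the derivative."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a + other.a, self.b + other.b)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
    __rmul__ = __mul__

def f(x, y, z):          # f : R^3 -> R
    return x * y + y * z

def grad_forward(f, args):
    """Gradient via pushforwards: one forward pass PER input coordinate."""
    grads = []
    for i in range(len(args)):
        # Seed the i-th input with tangent 1, all others with tangent 0.
        duals = [Dual(a, 1.0 if j == i else 0.0) for j, a in enumerate(args)]
        grads.append(f(*duals).b)
    return grads

print(grad_forward(f, [2.0, 3.0, 4.0]))  # [3.0, 6.0, 3.0]
```

With millions of inputs (neural-network weights) and a scalar loss, those n forward passes are the downside; that is why gradient computation standardly uses the pullback/rrule.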
Is it correct to think that the pushforward and pullback maps, in the context of differential geometry, together form a generalisation of a change of basis? Why or why not?
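A hedged sketch of the analogy (my own note, assuming a diffeomorphism \(\varphi : M \to N\) with local coordinates \(x^i\) on \(M\) and \(y^j\) on \(N\)): in components, the pushforward acts on vectors by the Jacobian and the pullback acts on one-forms by its transpose, which is exactly the contravariant/covariant change-of-basis pattern:

```latex
% Jacobian of \varphi at p: J^j{}_i = \partial y^j / \partial x^i .
(\varphi_* v)^j      = J^j{}_i \, v^i        \quad \text{(vectors: contravariant)}
(\varphi^* \alpha)_i = \alpha_j \, J^j{}_i   \quad \text{(one-forms: covariant)}
% A literal change of basis is the special case M = N with \varphi linear;
% in general the analogy holds only pointwise, via the linear map d\varphi_p.
```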
Say e⊗f (a tensor product) is a (1,1)-tensor on a manifold M, with e a vector and f a one-form, and let φ : M → N be a smooth map. Is it possible to construct a "pushforward" of e⊗f, i.e. a (1,1)-tensor on N? What about a pullback via a map ψ : N → M?
Or is this only possible with "pure" tensors of rank (r,0) or (0,s)?
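A sketch of the standard resolution (my own note, not from the thread): when the map is a diffeomorphism, any mixed tensor can be transported, because the inverse is available for the slots of the "wrong" variance:

```latex
% For a diffeomorphism \varphi : M \to N and p = \varphi^{-1}(q):
(\varphi_* (e \otimes f))_q(\alpha, X)
  = (e \otimes f)_p\big(\varphi^* \alpha,\; d\varphi^{-1}_q(X)\big),
  \qquad \alpha \in T_q^* N,\ X \in T_q N .
% Along a merely smooth map, only fully covariant (0,s)-tensors pull back
% naturally (no inverse needed); mixed tensors require the diffeomorphism.
```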
So I have a smooth map between manifolds u : N -> M, and let Y be a vector field on N, i.e. a section of TN. Now I want to define u_{*}Y (lower star), which I'm struggling with.
I understand u_{*}Y should be a vector field on M, i.e. a section of TM. I also know I can use the derivative du : TN -> TM, so at the end I'll have to apply this map once I have a vector in TN to arrive in TM. So in the end I'll be going M -> TN -> TM, I think, but I can't get that first step. I wanted to use u^{-1}, but u is just a smooth map rather than a diffeomorphism, so I didn't think I could do that. I'm sure it's simple, but I just can't see it. Any guidance?
Edit: Wikipedia seems to say u_{*}Y should be a section of the pullback bundle u^{*}(TM) over N. I wasn't expecting that and it seems unintuitive, but it would sort out some of the confusion above. I'd be interested in an elaboration of why it's a section of u^{*}(TM) over N rather than of TM over M.
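A hedged sketch of why the pullback bundle appears (my own note): the derivative only gives you a vector over the image point, indexed by the source point, and that is precisely a section of u^{*}(TM):

```latex
% Pointwise, du_p : T_p N \to T_{u(p)} M, so the natural object is
(u_* Y)_p \;=\; du_p\big(Y_p\big) \;\in\; T_{u(p)} M .
% As p ranges over N this assigns to each p \in N a vector in T_{u(p)} M,
% i.e. a section of the pullback bundle u^*(TM) \to N.
% It becomes an honest vector field on M only when u is a diffeomorphism:
(u_* Y)_q \;=\; du_{u^{-1}(q)}\big(Y_{u^{-1}(q)}\big), \qquad q \in M,
% since otherwise u may miss points of M or hit a point more than once,
% leaving (u_* Y)_q undefined or multiply defined.
```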
I've taken a course in differential geometry (mostly curves and surfaces in 3D Euclidean space though, not general Riemannian geometry) but upon further independent reading I've realized that I lack the level of understanding I need when it comes to the different kinds/notions of derivations on manifolds (I know the words, but feel as if I don't really get it).
If someone is willing to give an explanation (I'd welcome both general overviews, and deeper more far-reaching things) I'd be very grateful.
(Towards the future (PhD) I'm aiming towards geometric analysis, algebraic geometry, those kinds of things; if that helps with deciding explanatory angles)
The title says it...I feel like I'm so close to grasping what these concepts are supposed to do, but I'm not quite there, and the language we use to study them in class is very formal, which makes it quite hard to understand what's going on intuitively.
Let [; \phi: M \rightarrow N ;] be a diffeomorphism and let [; \nabla ;] be a covariant derivative on [; M ;]. The push-forward of [; \nabla ;] is the covariant derivative [; \tilde{\nabla} ;] on [; N ;] defined by [; \tilde{\nabla}_{X} Y = \phi_{*} (\nabla_{\phi^{*}(X)} (\phi^{*}(Y))) ;], where [; X, Y ;] are vector fields. Show that, if [; \nabla ;] is the Levi-Civita connection defined by a metric [; g ;] on [; M ;], then [; \tilde{\nabla} ;] is the Levi-Civita connection defined by the metric [; \phi_{*}(g) ;] on [; N ;].
I've verified all the properties of a Levi-Civita connection except the following: [; X \langle Y, Z \rangle = \langle \tilde{\nabla}_{X} Y, Z \rangle + \langle Y, \tilde{\nabla}_{X} Z \rangle ;]. Any ideas?
Many thanks in advance.
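A hedged sketch of the missing step (my own hint, not from the thread): write everything on N in terms of pulled-back objects on M, where metric compatibility of the original connection is available.

```latex
% With \langle \cdot,\cdot \rangle = \phi_*(g) and Y' = \phi^* Y, Z' = \phi^* Z:
X \langle Y, Z \rangle
  = X \big( g(Y', Z') \circ \phi^{-1} \big)
  = \big( (\phi^* X)\, g(Y', Z') \big) \circ \phi^{-1}
  = \big( g(\nabla_{\phi^* X} Y', Z') + g(Y', \nabla_{\phi^* X} Z') \big) \circ \phi^{-1}
  = \langle \tilde{\nabla}_X Y, Z \rangle + \langle Y, \tilde{\nabla}_X Z \rangle ,
% using X(h \circ \phi^{-1}) = ((\phi^* X) h) \circ \phi^{-1} for functions h,
% metric compatibility of \nabla with g, and the definitions of \phi_*(g)
% and \tilde{\nabla}.
```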
I am currently trying to understand why we find random variables convenient from a more rigorous standpoint. From my understanding, a random variable X is a mapping from the sample space of a probability space to some other measurable space, usually the real numbers. We are then able to work in this new measurable space because 1) X is a measurable function, so pre-images of measurable sets land in the sigma-algebra of our sample space, and as such 2) we can define a pushforward measure on the new measurable space that completely preserves the probability measure from our probability space.
My question is: why exactly is that something we are interested in doing? Based on answers online, is it because the sample space may lack nice structure (such as an ordering), so we would like to work in a much "cleaner" space? This suggests to me that we could still carry out the same ideas in our original probability space, just more tediously; or are some things in probability only achievable through random variables? A side question: is it correct to call the pushforward measure the probability distribution of our random variable?
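As an illustrative sketch of the last question (my own toy example; the names `omega`, `X`, and `pushforward` are mine): take two fair coin flips with the uniform measure, let X be the number of heads, and tabulate the pushforward measure (X_* P)(B) = P(X^{-1}(B)). The result is exactly what we informally call the distribution of X.

```python
from fractions import Fraction
from itertools import product

# Sample space: two fair coin flips, with the uniform probability measure P.
omega = list(product("HT", repeat=2))          # [('H','H'), ('H','T'), ...]
P = {w: Fraction(1, 4) for w in omega}

# Random variable X : Omega -> R, the number of heads.
def X(w):
    return w.count("H")

# Pushforward measure on the range of X: sum P over each pre-image X^{-1}({k}).
pushforward = {}
for w, p in P.items():
    pushforward[X(w)] = pushforward.get(X(w), 0) + p

print(pushforward)  # maps each value of X to its probability: 0 -> 1/4, 1 -> 1/2, 2 -> 1/4
```

Note how the distribution of X is computed without ever leaving the original space: every probability is a P-probability of a pre-image, which is the sense in which the pushforward "preserves" the measure.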
Do your worst!
It really does, I swear!
For context I'm a Refuse Driver (Garbage man) & today I was on food waste. After I'd tipped I was checking the wagon for any defects when I spotted a lone pea balanced on the lifts.
I said "hey look, an escaPEA"
No one near me but it didn't half make me laugh for a good hour or so!
Edit: I can't believe how much this has blown up. Thank you everyone, I've had a blast reading through the replies!
They're on standbi
Buenosdillas
Pilot on me!!
Dad jokes are supposed to be jokes you can tell a kid and they will understand it and find it funny.
This sub is mostly just NSFW puns now.
If it needs a NSFW tag it's not a dad joke. There should just be a NSFW puns subreddit for that.
Edit: I'm not replying any longer and I'm turning off notifications, but to all those who say "no one cares": there sure are a lot of you arguing about it. Maybe I'm wrong, but you people don't need to be rude about it. If you really don't care, don't comment.
What did 0 say to 8?
"Nice belt."
So what did 3 say to 8?
"Hey, you two, stop making out."
When I got home, they were still there.
I won't be doing that today!
You take away their little brooms
This morning, my 4 year old daughter.
Daughter: I'm hungry
Me: nerves building, smile widening
Me: Hi hungry, I'm dad.
She had no idea what was going on but I finally did it.
Thank you all for listening.
There hasn't been a post all year!
It's pronounced "Noel."
Why
After all his first name is No-vac
What, then, is Chinese rap?
Edit:
Notable mentions from the comments:
Spanish/Swedish/Swiss/Serbian hits
French/Finnish art
Country/Canadian rap
Chinese/Country/Canadian rock
Turkish/Tunisian/Taiwanese rap
There hasn't been a single post this year!
(Happy 2022 from New Zealand)
Nothing, it just waved