[Analysis, Partial differential equations] Calculating differential operators on simplicial complexes

Lately I've found an interest in solving PDEs numerically. I've done some simulations on rectangular grids (heat equation, wave equation with location-dependent velocity, etc.), and I'd like to try something more sophisticated. My end goal is to run numerical simulations on curved surfaces, but simple things first.

Flat domain

Say I've got a flat domain discretized into a bunch of triangles, with a value associated with each vertex. I would like to calculate the spatial derivatives at a point P given the values at P's immediate neighbors. I figured I could fit a linear function interpolating the values at the vertices of each triangle incident to P, calculate the derivatives of each interpolant, and then take a weighted average. However, I'm not sure whether the weights should be the triangle areas S or the edge lengths l.
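
A minimal sketch of the per-triangle approach (my own illustration, assuming NumPy and 2-D vertex positions): the linear interpolant over a triangle has a constant gradient, obtained by solving a 2Γ—2 system, and the vertex estimate is the area-weighted average over the incident triangles:

```python
import numpy as np

def triangle_gradient(p, u):
    """Gradient of the linear interpolant over one triangle.

    p: (3, 2) vertex positions, u: (3,) vertex values.
    Solves  [p1-p0; p2-p0] @ g = [u1-u0, u2-u0].
    """
    A = np.array([p[1] - p[0], p[2] - p[0]])
    return np.linalg.solve(A, np.array([u[1] - u[0], u[2] - u[0]]))

def triangle_area(p):
    # Half the absolute value of the 2-D cross product of two edges
    return 0.5 * abs((p[1, 0] - p[0, 0]) * (p[2, 1] - p[0, 1])
                     - (p[1, 1] - p[0, 1]) * (p[2, 0] - p[0, 0]))

def vertex_gradient(tri_points, tri_values):
    """Area-weighted average of per-triangle gradients around a vertex."""
    grads = np.array([triangle_gradient(p, u)
                      for p, u in zip(tri_points, tri_values)])
    areas = np.array([triangle_area(p) for p in tri_points])
    return (areas[:, None] * grads).sum(axis=0) / areas.sum()
```

Note that for a globally linear field every triangle reports the same gradient, so any positive weighting reproduces it exactly; area vs. edge-length weighting only matters for curved data.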

Abstracting the ambient space

Is it possible/sensible to express the gradient/Laplace operator not in terms of derivatives with respect to the axes of the ambient space, but in terms of some sort of local barycentric coordinates spanned by the neighboring vertices? Would the differential equation change in any way? E.g., would the equation u_{t} = βˆ‡^(2)u remain the same, or would some additional factors show up?

Curved domains

What considerations would I need to take into account if the domain weren't flat? For example, if the domain approximated the surface of a sphere or a torus? I feel like the curvature of the surface needs to be accounted for in the differential operators. This is why I thought about expressing the gradient not with respect to the ambient space but with respect to the local space defined by the simplicial complex (should that be a good idea).
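
One standard route here (not from the post, but the usual discrete-differential-geometry answer) is the cotangent-weight Laplacian: it is assembled purely from the intrinsic geometry of each triangle, so the same formula discretizes the Laplace–Beltrami operator on a mesh approximating a sphere or torus, with no explicit curvature terms. A dense sketch for clarity (a sparse matrix would be used in practice):

```python
import numpy as np

def cotan_laplacian(verts, tris):
    """Cotangent-weight discrete Laplacian.

    verts: (V, 3) vertex positions, tris: (T, 3) vertex indices.
    Each edge (i, j) gets weight 0.5 * cot(angle opposite the edge),
    summed over the one or two triangles containing it.
    """
    n = len(verts)
    L = np.zeros((n, n))
    for tri in tris:
        for k in range(3):
            i, j, o = tri[k], tri[(k + 1) % 3], tri[(k + 2) % 3]
            a = verts[i] - verts[o]
            b = verts[j] - verts[o]
            cot = a.dot(b) / np.linalg.norm(np.cross(a, b))
            w = 0.5 * cot
            L[i, j] += w; L[j, i] += w
            L[i, i] -= w; L[j, j] -= w
    return L
```

On a flat mesh this operator annihilates linear functions at interior vertices, which matches the interpolation picture above; on a curved mesh it plays the role of βˆ‡Β² in u_t = βˆ‡Β²u directly.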

All of this is for fun/learning, so one can assume everything is sufficiently nice, smooth and not an edge case.

πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/GourmetCat
πŸ“…︎ Jan 09 2022
🚨︎ report
Fourier transform operator applied to differential operator by itself

(Forgive me if I didn't tag this correctly, wasn't sure what the best fit was).

For work, I want to implement the CMOS noise reduction algorithm that's in this paper:

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7582543/

I'm having difficulty with this part, immediately preceding equation (8): https://imgur.com/2kQDjDH

  • What does it mean to apply a Fourier transform operator to a differential operator that's by itself, as in the denominator? I would usually interpret a lone differential operator as being applied to an implied "1", but here that would just zero out those terms every time, and that doesn't seem right.
  • Another question: in these lines, it appears that the matrix U is factored out from the left side of the equation. Does this imply that the identity matrix is left behind, and further that U must be square?
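
On the first question, one common reading (an assumption on my part about the paper's notation) is that applying the Fourier transform to a lone differential operator yields its symbol: F turns d/dx into multiplication by iΟ‰, so a bare operator inside F becomes the corresponding polynomial in iΟ‰ in the denominator, rather than acting on an implied 1. The underlying identity F{du/dx} = iω·F{u} is easy to check numerically:

```python
import numpy as np

N = 256
x = np.linspace(0.0, 2 * np.pi, N, endpoint=False)
u = np.sin(3 * x)        # sample signal
du = 3 * np.cos(3 * x)   # its exact derivative

# Angular frequencies matching NumPy's FFT bin ordering
w = 2 * np.pi * np.fft.fftfreq(N, d=x[1] - x[0])

lhs = np.fft.fft(du)            # F{du/dx}
rhs = 1j * w * np.fft.fft(u)    # (iω)·F{u}: the "symbol" of d/dx
```

Under this reading, dividing by F of a differential operator just means dividing by its symbol in frequency space.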

Thanks for any input.

πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/itsthehumidity
πŸ“…︎ Sep 30 2021
🚨︎ report
Currently going through differential operators and thought it would be fun procrastination to replicate a figure from my notes that visualises curl. Quite happy with the result haha! Thought I'd share. Happy differentiating/integrating, y'all
πŸ‘︎ 282
πŸ’¬︎
πŸ‘€︎ u/Aunty_Polly420
πŸ“…︎ May 25 2021
🚨︎ report
Testing a custom ADC shield for a BeagleBone using my teensy and arduino. It uses 3x 12 bit 8 channel ADC’s with single ended and pseudo differential modes.
πŸ‘︎ 7
πŸ’¬︎
πŸ‘€︎ u/xthatguy339x
πŸ“…︎ Feb 19 2021
🚨︎ report
Are there any (systematic) ways of finding a differential operator given its Green's function?

I was wondering about this recently, in particular in the context of getting more insight into the Laplace transform, by letting G(t,s) = exp(-st).
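
One concrete route (my own illustration, not from the post): since L G(Β·,s) = Ξ΄(Β·-s), the integral operator with kernel G is the inverse of L, so on a grid you can recover a discretization of L by inverting the matrix of Green's-function samples (scaled by the quadrature weight h). For L = -dΒ²/dxΒ² on (0,1) with Dirichlet boundary conditions, whose Green's function is known in closed form:

```python
import numpy as np

n = 50
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)

# Green's function of L = -d^2/dx^2 on (0,1) with u(0) = u(1) = 0:
#   G(x,s) = x(1-s) for x <= s,  s(1-x) for x > s
X, S = np.meshgrid(x, x, indexing="ij")
G = np.where(X <= S, X * (1 - S), S * (1 - X))

# u(x) = ∫ G(x,s) f(s) ds ≈ (h*G) @ f inverts L, so inverting h*G
# recovers the standard second-difference discretization of L.
Lh = np.linalg.inv(h * G)
```

Here `Lh` comes out (up to roundoff) as the familiar tridiagonal matrix with 2/hΒ² on the diagonal and -1/hΒ² off it. One caveat for the G(t,s) = exp(-st) idea: that kernel is smooth across t = s, whereas Green's functions of ODE operators have a kink there, so I'd expect no ordinary differential operator to have it as a Green's function.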

πŸ‘︎ 10
πŸ’¬︎
πŸ‘€︎ u/not-alredy-taken
πŸ“…︎ Jul 29 2021
🚨︎ report
Pseudo attempts to differentiate modern day standards of Serbo-Croatian (questions in text)

Croatians - Croatian linguistic purism i.e. making up words such as "zrakomlat", the infamous "mljevenik", etc. Making fun of "dakanje" e.g. acting as if it's grammatically incorrect to say "idem da radim" instead of "idem raditi".

Montenegrins - rejecting cyrillic, the addition of new letters such as "Ś" and "Ź".

Bosniaks - Turkisms, throwing in the letter h, e.g., "lahko", "polahko", etc.

I'm curious, what have Serbs done to differentiate their standard? Who has kept their standard most similar to pre-war Serbo-Croatian? And whose changes are most acceptable, and whose most laughable?

πŸ‘︎ 11
πŸ’¬︎
πŸ‘€︎ u/Supakmeraklija
πŸ“…︎ Dec 03 2021
🚨︎ report
How do I prove this formal adjoint identity for a differential operator Z?

I gave this question my earnest attempt and could not get anywhere.

Here is the question: https://ibb.co/nnHgsdd. For context, here are the relevant passages from the book: https://ibb.co/album/BKQyx7

Here is my attempt: https://ibb.co/album/R4NqKx. Can someone please show me the details of this proof?

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/scroo0ooge
πŸ“…︎ Aug 03 2021
🚨︎ report
Are there any good books for the topic on formal adjoints for differential operators?
πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/scroo0ooge
πŸ“…︎ Aug 04 2021
🚨︎ report
Had a tanker bar in the drive shaft to keep a nut on the differential from spinning. The operator jumped in the truck without looking to see if it was done.
πŸ‘︎ 99
πŸ’¬︎
πŸ‘€︎ u/perhasper
πŸ“…︎ Nov 29 2020
🚨︎ report
[Research] Fourier Neural Operator for Parametric Partial Differential Equations

View the full paper presentation here which includes a time-stamped outline:

Numerical solvers for Partial Differential Equations are notoriously slow. They need to evolve their state by tiny steps in order to stay accurate, and they need to repeat this for each new problem. Neural Fourier Operators, the architecture proposed in this paper, can evolve a PDE in time by a single forward pass, and do so for an entire family of PDEs, as long as the training set covers them well. By performing crucial operations only in Fourier Space, this new architecture is also independent of the discretization or sampling of the underlying signal and has the potential to speed up many scientific applications.

Abstract:

The classical development of neural networks has primarily focused on learning mappings between finite-dimensional Euclidean spaces. Recently, this has been generalized to neural operators that learn mappings between function spaces. For partial differential equations (PDEs), neural operators directly learn the mapping from any functional parametric dependence to the solution. Thus, they learn an entire family of PDEs, in contrast to classical methods which solve one instance of the equation. In this work, we formulate a new neural operator by parameterizing the integral kernel directly in Fourier space, allowing for an expressive and efficient architecture. We perform experiments on Burgers' equation, Darcy flow, and the Navier-Stokes equation (including the turbulent regime). Our Fourier neural operator shows state-of-the-art performance compared to existing neural network methodologies and it is up to three orders of magnitude faster compared to traditional PDE solvers.
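
The core building block is easy to sketch. This is a simplified 1-D version of a Fourier layer (my paraphrase of the idea, not the authors' code; the real architecture adds a pointwise linear path, nonlinearities, and learned per-channel complex weights): transform to Fourier space, multiply a truncated set of low modes by learnable weights, transform back:

```python
import numpy as np

def fourier_layer(u, weights, modes):
    """Simplified 1-D Fourier layer.

    u: (N,) real signal on a uniform grid
    weights: (modes,) complex multipliers (learned in the real model)
    Only the lowest `modes` frequencies are kept, which is what makes
    the layer independent of the grid resolution.
    """
    U = np.fft.rfft(u)
    out = np.zeros_like(U)
    out[:modes] = weights * U[:modes]
    return np.fft.irfft(out, n=len(u))
```

With all modes kept and unit weights the layer is the identity; truncating modes acts as a learned spectral filter, which is how the kernel is "parameterized directly in Fourier space".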

Authors: Zongyi Li, Nikola Kovachki, Kamyar Azizzadenesheli, Burigede Liu, Kaushik Bhattacharya, Andrew Stuart, Anima Anandkumar

πŸ‘︎ 234
πŸ’¬︎
πŸ‘€︎ u/Snoo_85410
πŸ“…︎ Nov 24 2020
🚨︎ report
[Calculus 2: Differential equations D operator method]

https://preview.redd.it/xi4s48hk11571.png?width=386&format=png&auto=webp&s=9b3799e9a4e033f33d170146a55bf10dcf19b37d

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/RESTITUTOR_ORBIS_
πŸ“…︎ Jun 13 2021
🚨︎ report
Fourier Neural Operator for Parametric Partial Differential Equations - Paper Explained

This new paper by researchers from Caltech & Purdue is notable for making significant advancements in solving Partial Differential Equations, critical for understanding the world around us. Here is a great video that explains the paper in detail. (You can click the time-stamps on the right to jump between sections.)

Quick Summary:

Numerical solvers for Partial Differential Equations are notoriously slow. They need to evolve their state by tiny steps in order to stay accurate, and they need to repeat this for each new problem. Neural Fourier Operators, the architecture proposed in this paper, can evolve a PDE in time by a single forward pass, and do so for an entire family of PDEs, as long as the training set covers them well. By performing crucial operations only in Fourier Space, this new architecture is also independent of the discretization or sampling of the underlying signal and has the potential to speed up many scientific applications.

More information about the research:

Paper: https://arxiv.org/abs/2010.08895

Code: https://github.com/zongyi-li/fourier_neural_operator/blob/master/fourier_3d.py

MIT Technology Review: https://www.technologyreview.com/2020/10/30/1011435/ai-fourier-neural-network-cracks-navier-stokes-and-partial-differential-equations/

πŸ‘︎ 124
πŸ’¬︎
πŸ‘€︎ u/othotr
πŸ“…︎ Nov 25 2020
🚨︎ report
A constant and e to the x are in a bar and a differential operator walks in

The constant says, "sorry, e to the x, every time he comes in he kills me."

e to the x says, "huh, when I meet him nothing happens."

The constant leaves and the differential operator orders at the bar. e to the x decides to go over and say hi.

"Hi, I'm e to the x," e to the x says to the differential operator.

The differential operator responds, "hi, I'm d/dy."

πŸ‘︎ 48
πŸ’¬︎
πŸ‘€︎ u/ADS_Fibonacci
πŸ“…︎ Nov 18 2020
🚨︎ report
Complementary solution to the formal adjoint of a linear differential operator

I have a linear differential operator and its complementary solution u. Is there a way to show that the complementary solution of the adjoint operator is u/(p(x)Β·W), where p is the coefficient of the second-order derivative and W is the Wronskian?

πŸ‘︎ 4
πŸ’¬︎
πŸ‘€︎ u/MindlessTie2
πŸ“…︎ Mar 25 2021
🚨︎ report
Fourier Neural Operator for Parametric Partial Differential Equations arxiv.org/pdf/2010.08895.…
πŸ‘︎ 19
πŸ’¬︎
πŸ‘€︎ u/Huang_Yong
πŸ“…︎ Oct 24 2020
🚨︎ report
How do you pronounce the partial differential operator?

I mean the curved d. Some people just call it d, others call it doo/die, others call it partial. What's the right way to read it?

πŸ‘︎ 38
πŸ’¬︎
πŸ‘€︎ u/DeSteph-DeCurry
πŸ“…︎ Jul 05 2020
🚨︎ report
Shift in Eigenvalues of differential operator

For the differential operator d^2y/dx^2 with, say, dy/dx = 0 at x = 0 and x = 1, we know that the eigenvalues are Ξ»_j = -(jΟ€)^2 for j = 0, 1, ...

We know that the eigenvalues of the operator d^2y/dx^2 + ay are the same as above but shifted by a when a is constant. However, can we estimate how much they shift when a = a(x)? I've looked in the literature and haven't found an answer. This seems like a classical problem, so I'm shocked that it's been so hard to find an answer.
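
As far as I know the classical answer is first-order perturbation theory: for small (or gently varying) a(x), Ξ»_j shifts by approximately ∫_0^1 a(x) Ο†_j(x)^2 dx, the average of a weighted by the normalized unperturbed eigenfunction; the constant-a case is recovered since ∫φ_j^2 = 1. A finite-difference sanity check (my own sketch, Neumann conditions imposed by ghost-point reflection):

```python
import numpy as np

n = 201
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)

# d^2/dx^2 with dy/dx = 0 at x = 0 and x = 1 (ghost-point reflection)
A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) / h**2
A[0, 1] = 2.0 / h**2
A[-1, -2] = 2.0 / h**2

lam0 = np.sort(np.linalg.eigvals(A).real)[::-1]   # ≈ 0, -π², -4π², ...

# Variable coefficient: eigenvalues of A + diag(a(x))
a = 0.5 * np.sin(np.pi * x) ** 2
lam = np.sort(np.linalg.eigvals(A + np.diag(a)).real)[::-1]

# First-order estimate for j = 1: shift ≈ ∫ a φ_1², with φ_1 = √2 cos(πx).
# a vanishes at the endpoints, so this sum is the trapezoid rule.
phi1 = np.sqrt(2.0) * np.cos(np.pi * x)
shift_est = h * np.sum(a * phi1**2)
```

For this a(x) the estimate is 1/8, and the computed Ξ»_1 shift agrees to a few parts in a thousand; the second-order correction grows when a varies quickly or is large compared to the eigenvalue gaps.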

πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/FloatingNode
πŸ“…︎ Oct 15 2020
🚨︎ report
If P(D) is a linear differential operator and y satisfies P(D)y = 0, how can we show that xy satisfies P(D)^2 (xy) = 0?

In my differential equations class we learned that when you have a second-order homogeneous ODE whose differential operator is a perfect square, the substitution y = e^(rx) only produces one solution. It's not too difficult to find the other solution (xe^(rx)) in this case because the order is low, but when we have a higher-order operator, how do we show that multiplying a solution by x produces a new solution?
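
For constant-coefficient P, this follows from the identity P(D)(xu) = x P(D)u + P'(D)u (check it on each power D^k, where D^k(xu) = x D^k u + k D^(k-1) u, and extend by linearity). Then:

```latex
P(D)^2(xy) = P(D)\bigl[\,x\,P(D)y + P'(D)y\,\bigr]
           = x\,P(D)\,\bigl[P(D)y\bigr] + 2P'(D)\,\bigl[P(D)y\bigr] = 0,
```

using that constant-coefficient operators commute, so P(D)P'(D)y = P'(D)P(D)y; every term applies an operator to P(D)y = 0.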

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/okaythanksbud
πŸ“…︎ Feb 27 2021
🚨︎ report
Some of my favourite 'Pseudo Juggs' looks, love combining the Chainsaw with armored operator skins reddit.com/gallery/iguw1i
πŸ‘︎ 39
πŸ’¬︎
πŸ‘€︎ u/yamuzwaran
πŸ“…︎ Aug 26 2020
🚨︎ report
Learning more about Polynomial/Differential Operators?

Hello r/math.

I'm reading in the book "Ordinary Differential Equations" by Tenenbaum and Pollard, and there's a chapter that introduces differential and polynomial operators.

They represent the derivative d/dx as D, and then you can have a polynomial in D that you can "multiply" with a function. D^2 y means d^2 y/dx^2, and (D+1)^2 y = D^2 y + 2Dy + y etc. The algebraic roots of the polynomial behave as expected in this new world. (They also represent Laplace transform with it, but I know Laplace well and I'm not looking for more on that.)

The method seems very powerful (like the shift theorem), and it feels like there's a lot more to it than what's in that book, especially on the algebra side. But I can't find any good references that are easy for me to understand.
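
One easy way to experiment with these operator identities is SymPy (my own sketch, not from the book). For instance, the exponential shift theorem P(D)[e^(ax) u] = e^(ax) P(D+a)u predicts that (D-2)^2 annihilates x e^(2x), since D^2 x = 0:

```python
import sympy as sp

x = sp.symbols('x')
f = sp.exp(2 * x) * x

# (D - 2)^2 f  =  f'' - 4 f' + 4 f
expr = sp.diff(f, x, 2) - 4 * sp.diff(f, x) + 4 * f
print(sp.simplify(expr))  # → 0
```

Checking small cases symbolically like this is a nice complement to the algebraic theory (polynomial rings in D, and eventually the Weyl algebra).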

From https://en.wikipedia.org/wiki/Differential_operator, I see that Weyl algebra may be the final destination. But that last page looks way too complicated to me.

Also I accidentally ran into a paper that justifies what used to be called "umbral calculus" with another form of "operators". So this stuff must be really good!

I'm comfortable with most undergrad topics, and know groups/rings/fields+ in abstract algebra, and happy to read anything as long as it's not 30 new advanced words for me in every paragraph :).

Do you have a good recommendation for a book/document that builds up that theory of "operators" well, but without assuming, like, advanced graduate stuff?

Thank you :).

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/WobblyKitten
πŸ“…︎ Oct 26 2020
🚨︎ report
Resources on Differential Geometry, (pseudo-) Riemannian manifolds, Differential Forms, General Relativity

I'm looking for some graduate texts on the subjects in the title and preferably some including relevant material on multilinear algebra and tensor analysis. I'm aware of do Carmo's Differential Geometry of Curves and Surfaces as well as Gravitation by Thorne et al. I'd like to supplant the former (whose language is a bit uncanonical) and be equipped to comprehend the latter. I did my undergrad in physics and math, have a good fundamental understanding of general topology, manifolds, and abstract algebra and have done work in special relativity, if that helps.

πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/tmcopeland
πŸ“…︎ Jan 17 2019
🚨︎ report
[R] Difference between autocorrelation vs adding a differential operator in a GLM

Happy Labor Day, my fellow statisticians. As the title suggests, I'm curious what the difference is, if any, between adding an autocorrelation term and fitting a dy/dt term up to a specific order in a general linear model. Is one inherently biased, or are they the same in terms of the linear algebra? Thanks for any insight!

πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/smoore0918
πŸ“…︎ Sep 07 2020
🚨︎ report
Fourier Neural Operator for Parametric Partial Differential Equations arxiv.org/abs/2010.08895v…
πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/ShareScienceBot
πŸ“…︎ Oct 25 2020
🚨︎ report
Kinda struggling with these ODE problems. For each of the following functions, find the LTI differential operator p(D) having it as its unit impulse response. I think it would help to read some explanations/approaches here.

Problem (d) https://ocw.mit.edu/courses/mathematics/18-03sc-differential-equations-fall-2011/unit-iii-fourier-series-and-laplace-transform/unit-step-and-unit-impulse-response/MIT18_03SCF11_ps6_II_s25q.pdf

Soln https://ocw.mit.edu/courses/mathematics/18-03sc-differential-equations-fall-2011/unit-iii-fourier-series-and-laplace-transform/unit-step-and-unit-impulse-response/MIT18_03SCF11_ps6_II_s25s.pdf

I really tried for more than an hour.

I don't understand the part where: "This function has a jump", "This function has no jump but its derivative does", and "This function w(t) has no jump in value or derivative, but its second derivative does jump"
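
The "jump" remarks encode the order of p(D): solving p(D)w = Ξ΄(t) with p of degree n smooths the delta by n integrations, so w^(n-1) jumps at t = 0 while all lower derivatives are continuous. A function that jumps in value therefore comes from a first-order p(D), one whose first derivative jumps from a second-order p(D), and so on. A SymPy sketch with my own example (not problem (d)), using W(s) = 1/p(s):

```python
import sympy as sp

t = sp.symbols('t', positive=True)
s = sp.symbols('s')

# Unit impulse response w(t) = e^{-t} for t > 0: it jumps from 0 to 1
# at t = 0, so p(D) should be first order.
W = sp.laplace_transform(sp.exp(-t), t, s, noconds=True)  # 1/(s + 1)
p = sp.simplify(1 / W)
print(p)  # → s + 1, i.e. p(D) = D + 1
```

Matching the jump order to the degree of p, and then the decay rates to its roots, is essentially what the solution set is walking through.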

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/superrenzo64
πŸ“…︎ Aug 13 2020
🚨︎ report
Is it that there are more fans of Sadhguru outside Tamil Nadu because Tamil Nadu has had a continuous history of spirituality, bhakti, and yogic practices, and so can differentiate pseudoscience better than the rest of India? It seems like many youths (mostly them) outside Tamil Nadu are falling for this man.
πŸ‘︎ 12
πŸ’¬︎
πŸ‘€︎ u/reddit_gurubhai
πŸ“…︎ Apr 07 2021
🚨︎ report
Another pseudo-HoI IV super event for the Gundam homebrew rpg I'm running. This one pertains to the Australian response to Operation British youtube.com/watch?v=6QDbj…
πŸ‘︎ 20
πŸ’¬︎
πŸ‘€︎ u/Sevchenko874
πŸ“…︎ Oct 02 2021
🚨︎ report
How can we differentiate conversion disorder from pseudo-seizure (psychogenic non-epileptic seizure)?
πŸ‘︎ 2
πŸ’¬︎
πŸ“…︎ May 18 2021
🚨︎ report
Since vacuum cleaners operate by creating a pressure differential between atmospheric pressure and their internal pressure, does that mean vacuum cleaners operate less efficiently at higher altitudes?
πŸ‘︎ 40
πŸ’¬︎
πŸ‘€︎ u/nichdos
πŸ“…︎ Sep 18 2021
🚨︎ report
[D] Paper Explained - Fourier Neural Operator for Parametric Partial Differential Equations (Full Video Analysis)

https://youtu.be/IaS72aHrJKE

Numerical solvers for Partial Differential Equations are notoriously slow. They need to evolve their state by tiny steps in order to stay accurate, and they need to repeat this for each new problem. Neural Fourier Operators, the architecture proposed in this paper, can evolve a PDE in time by a single forward pass, and do so for an entire family of PDEs, as long as the training set covers them well. By performing crucial operations only in Fourier Space, this new architecture is also independent of the discretization or sampling of the underlying signal and has the potential to speed up many scientific applications.

OUTLINE:

0:00 - Intro & Overview

6:15 - Navier Stokes Problem Statement

11:00 - Formal Problem Definition

15:00 - Neural Operator

31:30 - Fourier Neural Operator

48:15 - Experimental Examples

50:35 - Code Walkthrough

1:01:00 - Summary & Conclusion

Paper: https://arxiv.org/abs/2010.08895

Blog: https://zongyi-li.github.io/blog/2020/fourier-pde/

Code: https://github.com/zongyi-li/fourier_neural_operator/blob/master/fourier_3d.py

MIT Technology Review: https://www.technologyreview.com/2020/10/30/1011435/ai-fourier-neural-network-cracks-navier-stokes-and-partial-differential-equations/

πŸ‘︎ 47
πŸ’¬︎
πŸ‘€︎ u/ykilcher
πŸ“…︎ Nov 22 2020
🚨︎ report
