A list of puns related to "Commutative Algebra"
Here are some (relatively) recent books on commutative algebra beyond the usual resources of Atiyah-Macdonald, Eisenbud, Matsumura, or Reid. I have not read these, but I want to bring a bit more attention to them, as they could be quite useful for those trying to get into CA/AG.
And for the most recent one, which doesn't even have reviews yet, here it is:
Additionally, some free notes on the subject:
The following also cover some algebraic geometry, in what seems to be a similar vein to the classic CA+AG text by Kunz.
Hello reddit!
I wonder if there's a way to make an introductory course in commutative algebra for school kids at my local math school for gifted kids. I'm reading the Atiyah & Macdonald book and having a great time, and I'm sure these kids are able to understand the first few chapters. But this book is kind of dry, and I would like to add some connections to other areas of math and some applications. So my question is: where can I find connections and applications that do not rely on very advanced math?
We're doing modules in this particular order:
Definition of a module
Definition of a submodule
Definition of a quotient module
Definition of a generated submodule
Definition of a (homo)morphism of modules
Definition of a cokernel
Definition of the linear independence of a subset for a module
Definition of a module's base
Definition of a free module
Definition of the rank of a module
Definition of an exact sequence
Definition of a zero divisor of a module
Definition of an annihilator
Definition of a simple module
Definition of a quotient/product of modules
Theorems of isomorphisms of quotients/products of modules
Definition of a direct sum of submodules
Definition of a coproduct of a family of modules
Definition of a product of a family of modules
Definition of a projective module
Nakayama Lemma
Definition of a chain of modules
Jordan–Hölder Theorem
Definition of a ring/field of fractions
Ring localization theorem
Definition of an extended ideal
Definition of a contracted ideal
Module localization theorem
Definition of a tensor product
Theorem about the existence of a tensor product
Definition of a simple tensor
I hope I translated it properly haha.
The reason I ask is because all of that was done in our last two lectures where he had to rush through the presentation and had no time to explain the motivation, intuition and interpretation of all of that.
I'm looking forward to your suggestions.
Hello math reddit! I'm a first-year graduate student taking my first commutative algebra course this semester. We are using Eisenbud, but I personally find it dense and somewhat difficult to follow. I was hoping to find other references to help me, because I am really beginning to struggle with this class. If you guys could recommend any books, online lecture notes, or even youtube videos, I'd really appreciate it! Thanks in advance.
I don't need a proof, but I'm just a little confused about the statement. If a ring has unity 1, then x·1 = x. But if x is a zero divisor, then x·1 = 0 for x ≠ 0, so there wouldn't exist a unity. Am I thinking about this correctly?
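For what it's worth, a concrete case that might untangle this (my example, not from the original post): being a zero divisor only requires x·y = 0 for *some* nonzero y, not for y = 1.

```latex
% In Z/6Z: 2 is a zero divisor, yet the unity still acts as the identity on it.
\begin{aligned}
2 \cdot 3 &\equiv 0 \pmod{6}, \quad 2 \neq 0,\ 3 \neq 0 && \text{(so 2 is a zero divisor)}\\
2 \cdot 1 &\equiv 2 \pmod{6} && \text{(so } x \cdot 1 = x \text{ still holds)}
\end{aligned}
```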
That's a subject I'm gonna have next semester and I'd like to prepare myself a little bit for it.
What books are good on this topic? What are in general your tips and advice for this subject?
I've just started to work through the book by Sturmfels and Miller in preparation for my PhD, and I'd be interested in someone to discuss concepts with and bounce ideas off.
Essentially, I was asked to prove that in a more specific context, the canonical map f(x ⊗ y) = xy is not injective. It is, however, surjective in general. This doesn't make much sense to me, though, since the tensor product is taken over R, so any simple tensor x ⊗ y can be written as xy ⊗ 1, and hence any element of I ⊗ I can be written as Σ x_i y_i ⊗ 1 for elements x_i, y_i of I. Wouldn't the map f, which here sends x ⊗ 1 to x·1 = x, define an isomorphism, then? I am very confused.
Thank you in advance for all your help.
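A standard example (possibly not the specific context the exercise intends) of non-injectivity, which also locates the gap in the argument above: 1 need not lie in I, so x ⊗ y cannot in general be rewritten as xy ⊗ 1.

```latex
% Take R = k[x,y] and I = (x,y), with the canonical map f : I \otimes_R I \to I^2.
% The element below is nonzero in I \otimes_R I (check its image in
% (I/I^2) \otimes_k (I/I^2)), yet it maps to zero:
f(x \otimes y - y \otimes x) = xy - yx = 0,
\qquad x \otimes y - y \otimes x \neq 0 \ \text{in}\ I \otimes_R I.
```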
I'm doing a few heavily algebraic courses right now (AG,ANT,RT), and they draw on a very wide variety of results from commutative algebra.
I find it hard to remember basic properties and their implications for rings and modules, so I'm always spending time looking them up. Things like which types of rings are PID/UFD/local/integrally closed, which modules are finitely generated/free over which types of rings, how these pass through quotients, etc. Are there mnemonic devices or similar that people find helpful for this? Besides just doing it for enough years to remember everything.
Noether's Normalization Theorem: (or a version of it I believe)
Let F be a field. If F is a finitely generated K-algebra then F is an algebraic extension of K.
Let's take a simple example. Let F be the Q-algebra finitely generated by {1, π}. I think that's Q(π), right? Then the theorem implies Q(π) is an algebraic extension of Q. Which it isn't. Therefore I made a mistake.
Is the mistake that I don't know the definition of a finitely generated Q-algebra? It should be the smallest Q-algebra containing 1 and π. But since Q-algebras allow products between any elements, including π, the Q-algebra finitely generated by {1, π} should contain 1, π, π^2, π^3, ... Implying it should be equal to Q(π). The rest of the argument should follow.
EDIT: IT ISN'T A FIELD. NEVER MIND.
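For anyone else who hits the same wall, a one-line version of the edit (my summary, not the poster's): the hypothesis that F be a *field* is exactly what fails.

```latex
% Since \pi is transcendental over \mathbb{Q}, evaluation at \pi gives
\mathbb{Q}[\pi] \cong \mathbb{Q}[x],
% a polynomial ring, which is not a field (x has no inverse); in particular
% \mathbb{Q}[\pi] \subsetneq \mathbb{Q}(\pi), its field of fractions.
```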
Greetings! I am looking for (self-motivated) people who would be interested in studying commutative algebra and algebraic geometry. I am planning to focus on 3 books: 1) Atiyah and Macdonald for commutative algebra, 2) Fulton, Algebraic Curves, 3) Shafarevich, Basic Algebraic Geometry.
My aim is to do most of the exercises from these books (something that I have not done in the past). Also, we can work together on some expository pieces as well, so that the reading is not completely dry.
I feel that studying together (in terms of having discussions and exchanging ideas) is much better than keeping everything to yourself. Also, I would like to make a special request to persevering, interested people to join me. It will also include some accountability to finish things and not take a very relaxed approach (which creeps in when you are studying on your own). Thanks!
In Linear Algebra it's typical to have a linearly dependent set and reduce it to a linearly independent set. Here I want to reduce an algebraically dependent set to an algebraically independent one. Is there such a reduction algorithm?
I'll give an example of what I mean in linear algebra.
For example, imagine we have a vector space and a set of vectors:
S = {(2,0), (3,0), (0,1)}.
We'd like to reduce this set, element by element, until we arrive at a linearly independent set, but in a non-stupid way. An example of a stupid way would be to remove elements until we arrive at a linearly independent set: for example, first we could remove (0,1) from S. But S \ {(0,1)} isn't linearly independent, so we'd remove (3,0), obtaining {(2,0)}.
There are non-stupid ways to reduce S. Pick v in S and ask: "Do <S\v> and <v> intersect non-trivially?" If yes, remove v; if not, keep v. Repeat this until S is linearly independent.
This is because, in a sense, "Do <S\v> and <v> intersect non-trivially?" is equivalent to asking "Are S\v and v independent in the linear sense?"
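Here is a quick sketch (mine, not from the post) of that non-stupid reduction for the linear case; a numpy rank test stands in for the intersection question, since for a nonzero v, <v> ∩ <S\v> is non-trivial exactly when v lies in span(S\v).

```python
# Greedy reduction of a spanning set to a linearly independent one.
import numpy as np

def reduce_to_independent(vectors):
    """Greedily drop vectors lying in the span of the remaining ones."""
    vecs = [np.asarray(v, dtype=float) for v in vectors]
    i = 0
    while i < len(vecs):
        others = vecs[:i] + vecs[i + 1:]
        # v depends on the others iff adding it does not raise the rank.
        if others and np.linalg.matrix_rank(np.vstack(others + [vecs[i]])) \
                == np.linalg.matrix_rank(np.vstack(others)):
            vecs.pop(i)  # remove v; the span of the set is unchanged
        else:
            i += 1       # keep v and move on
    return vecs

print(reduce_to_independent([(2, 0), (3, 0), (0, 1)]))
# -> [array([3., 0.]), array([0., 1.])]: (2,0) is dropped, the span survives
```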
............................................................
Is there a similar concept in algebra? One example would be to ask: "Do the algebraic closures of K(S\v) and of K(v) intersect non-trivially?" By intersecting trivially we mean that the intersection is K.
Does this work? Is it independent of the way in which we reduce our set? It'd be odd if one reduction had cardinality 3 and another had cardinality 4.
Say that a commutative ring R is coherent if every finitely generated ideal in R is finitely presented. (I.e., if I is a finitely generated ideal of R, then there is an exact sequence R^m -> R^n -> I -> 0.)
Claim: Every commutative ring is coherent.
Proof: Let R be a commutative ring, and let I be an ideal of R, say generated by f_1, ..., f_k. Any commutative ring is a direct limit of Noetherian rings R_i: R is the direct limit of its subrings finitely generated over Z (namely rings of the form Z[x_1, ..., x_i] for x_1, ..., x_i in R), and such rings are Noetherian by the Hilbert basis theorem. Moreover, there is some such subring which contains f_1, ..., f_k, namely Z[f_1, ..., f_k], and in fact the set of such subrings of R is cofinal in the set of all subrings finitely generated over Z. (If S is a subring of R finitely generated over Z, then so is S[f_1, ..., f_k].)
But now Noetherian rings are obviously coherent, so if R_i contains f_1, ..., f_k, then there is some exact sequence R_i^m -> R_i^n -> (f_1, ..., f_k)R_i -> 0. But now taking direct limits is exact and direct limits commute with direct sums, so this gives an exact sequence R^m -> R^n -> (f_1, ..., f_k)R -> 0, so indeed the ideal is finitely presented and R is coherent. QED
However, the claim is not true (see http://math.stanford.edu/~vakil/216blog/incoherent.pdf), so of course the proof must be wrong too. Can you find the mistake? Can you add (nontrivial) hypotheses to the claim to make the (main idea of the) proof work?
I'm reading Atiyah-Macdonald's commutative algebra text and I have no clue which exercises to do. Does anyone have a link to a class which has used this as the main text with exercises assigned from the book, or any recommendations on exercises to pick?
I need help with the following. Can someone explain where the Ψ went?
Let G be a commutative monoid, and x_1, ..., x_n elements of G. Let Ψ be a bijection of the set of integers {1, ..., n} onto itself. Then
x_Ψ(1) * x_Ψ(2) * ... * x_Ψ(n) = x_1 * x_2 * ... * x_n.
The chapter was pretty intuitive up to this point, but now I'm lost.
This semester I am taking a commutative algebra course at my university, and I find it ridiculously hard. I have all the prerequisites, but still I find it very tough. We use Miles Reid's book, and also some parts of Introduction to Commutative Algebra by Atiyah and Macdonald. However, I find neither book to be very reader-friendly, and as I prefer self-study, I am pretty much lost...
Do any of you have any advice? I mean, study notes, other books, just anything.
To put things into perspective, I find chapter 1 of Miles Reid's book very hard, particularly the classification of the maximal ideals in Z[x], and many other things. I got an A in the algebra course offered at my uni, so I know general stuff fairly well, such as groups and basics about fields and rings.
(Edit: Typos)
Hello everyone!
I am currently studying the construction of the field of fractions of a commutative ring.
We saw that basically the construction is inspired by the construction of Q from Z. This justifies the equivalence relation that we consider and the operations we define on this field.
We constructed the field, together with a morphism from R into it, out of pairs in R×D, where R is the ring in question and D is a subset that doesn't contain any zero divisor.
I think I understand the construction, but I can't grasp why we necessarily have to take D as a subset that doesn't contain any zero divisor. I get that if we come back to the analogy with Q, this would be tantamount to dividing by zero, but when I stay in the abstract construction I can't figure out where the problem is!
Thank you for your help!
Edit: Changed 'field of fractions' to 'ring of fractions' in the title.
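If it helps, here is one concrete thing that goes wrong (my example, not from the post): when D contains a zero divisor, the naive relation (a, s) ~ (b, t) ⟺ at = bs stops being transitive, so the construction doesn't even produce a well-defined set of equivalence classes.

```latex
% In R = Z/6Z with the multiplicative set D = {1, 2, 4} (note 2 \cdot 3 = 0):
(3,1) \sim (0,2) \quad\text{since } 3 \cdot 2 = 0 = 0 \cdot 1,
\qquad
(0,2) \sim (0,1) \quad\text{since } 0 \cdot 1 = 0 = 0 \cdot 2,
% yet (3,1) \not\sim (0,1), since 3 \cdot 1 = 3 \neq 0 = 0 \cdot 1.
% Transitivity fails, which is why one either bans zero divisors from D or
% weakens the relation to: u(at - bs) = 0 for some u in D.
```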
Let (A, *_1) and (B, *_2) be two binary structures. We let f: A → B be an isomorphism. Prove *_1 is commutative if and only if *_2 is commutative.
I'm not sure of how to go about proving this. This is what I have thus far:
*_1 is commutative → *_2 is commutative
f(a *_1 b) = f(a) *_2 f(b)
f(b *_1 a) = f(b) *_2 f(a)
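One hedged way to finish the forward direction (my sketch, using surjectivity of f to write arbitrary elements of B as images):

```latex
% Take any a', b' in B. Since f is onto, a' = f(a) and b' = f(b) for some a, b in A.
a' *_2 b' = f(a) *_2 f(b) = f(a *_1 b) = f(b *_1 a) = f(b) *_2 f(a) = b' *_2 a'.
% The reverse direction is symmetric, applying the same argument to f^{-1}.
```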
In computer graphics, let's imagine that I have a matrix, P, of points that make a shape, such that P is:
X1 X2 X3
Y1 Y2 Y3
I can multiply P by another matrix, S, such that S x P results in the shape being resized. If f is the factor by which to resize, then S is:
f 0
0 f
I can also multiply P by a matrix, R, such that R x P results in the shape being rotated. If a is the angle by which I want to rotate the shape, then R is:
cos(a) -sin(a)
sin(a) cos(a)
If I want a composite matrix which both resizes and rotates the shape, it seems like I can create that composite matrix by calculating either S x R or R x S, i.e. matrix multiplication is commutative.
But how true is that, generally? Obviously it will only work for matrices where the width of one is the same as the height of the other and vice versa, but my intuition is telling me that it wouldn't be true for all square matrices. Is there a way to know when it's true? If it's not true, is there a way to know in which order the matrices should be multiplied?
Thanks!
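A quick numerical check can settle the intuition (my own sketch; the matrix names follow the post): the uniform-scale matrix S is f times the identity, so it commutes with every 2x2 matrix, but that is special rather than general.

```python
# Verify that uniform scaling commutes with rotation, while a generic
# pair of square matrices does not.
import numpy as np

a = np.deg2rad(30)
S = np.array([[2.0, 0.0],
              [0.0, 2.0]])                   # uniform scale with f = 2
R = np.array([[np.cos(a), -np.sin(a)],
              [np.sin(a),  np.cos(a)]])      # rotation by 30 degrees

print(np.allclose(S @ R, R @ S))  # True: S = 2 * I commutes with everything

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])                   # a shear, for contrast
print(np.allclose(A @ R, R @ A))  # False: shear and rotation do not commute
```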
Would really appreciate any help on this, thank you.
I once audited a classical AG course, so I knew that commutative algebra (CA) was quite heavily used in AG. I am also going to take an advanced comm. alg. course in the fall, provided I can finish Atiyah & Macdonald during the summer.
I'm interested in number theory, graph theory, and algorithms. From what I have studied, these fields usually do not involve a lot of algebra (other than the first one), so I was wondering, from a higher perspective, what the usual role of CA in modern mathematics is. Would it help my research in any way if I study these in grad school?
Thanks!
Update: Looks like we've a satisfactory number of people to form the group. Now, the next things to decide are:
Let's discuss these. Most importantly, we need a platform ASAP.
Update 2:
Hello again, everyone! Seems like we have two choices for the group: Slack and a subreddit. Personally, Slack seemed more feature-rich and robust to me. So, I've already created our Slack domain, commalg.slack.com. PM me your email so that I can send you an invitation. A post to discuss our knowledge of abstract algebra, so that we can get everybody on the same page, is already there.
I'm mentioning here whoever has shown interest till now, PM me your email. /u/RidderJanssen /u/murpwp /u/Haranu /u/JSURATA /u/ThisIsMyOkCAccount /u/CraftyBarbarianKingd /u/yoloed /u/Alozzk /u/MatheiBoulomenos /u/marcelluspye /u/PupilofMath /u/SourAuclair /u/sagarcia11 /u/gshfr /u/CliffordAlgebra /u/tactics /u/dogeatsmoths /u/NullStellen /u/Bobech /u/prime_idyllic /u/AseOdin Hope I've not missed anybody. PM me if you happen to see this post just now and you're interested.
Possibly, we'll be needing one more team admin on Slack. And finally, if Slack doesn't work for us by today, we'll land on a subreddit.
Look at this image of the dot product visualization: https://i.imgur.com/xJ2RUkB.png
The left part is what I found on the Wolfram website, but I don't see why the right one couldn't also be correct. I know the dot product is commutative, but why does this picture make it look like it isn't?
I feel like I'm missing something pretty basic here. Any help is appreciated.
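In case it helps, a worked identity (mine, not from the linked page) showing why both pictures compute the same number: the two projection lengths differ, but each gets multiplied by the other vector's length.

```latex
\vec a \cdot \vec b
  = \lVert \vec a \rVert\, \lVert \vec b \rVert \cos\theta
  = \underbrace{\bigl(\lVert \vec a \rVert \cos\theta\bigr)}_{\text{proj. of } \vec a \text{ onto } \vec b}\, \lVert \vec b \rVert
  = \lVert \vec a \rVert\, \underbrace{\bigl(\lVert \vec b \rVert \cos\theta\bigr)}_{\text{proj. of } \vec b \text{ onto } \vec a}.
```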
So as you can see from my username, I made this account because of how frustrated I am with this problem: Show T(Z/nZ) is isomorphic to Z[x]/(nx), where T(M) is the tensor algebra of M.
I know the elements of the ring Z[x]/(nx) are polynomials a_0 + a_1·x + a_2·x^2 + ... + a_m·x^m where a_0 is in Z and a_1, ..., a_m are in Z/nZ.
Going from the definition of a tensor algebra, I think T(Z/nZ) is Z ⊕ (Z/nZ) ⊕ (Z/nZ ⊗ Z/nZ) ⊕ (Z/nZ ⊗ Z/nZ ⊗ Z/nZ) ⊕ ... If I'm not mistaken, Z/nZ ⊗ Z/nZ = Z/nZ, so would this just be Z ⊕ an infinite number of copies of Z/nZ?
This is from a problem set I've already turned in but I'd still like to figure it out so any hints would be appreciated. Thanks much!
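Since hints were requested, here is one (my framing, under the assumption that the tensor products are over Z): compare the two sides degree by degree.

```latex
% As graded abelian groups, both sides are Z in degree 0 and Z/nZ in each
% positive degree:
T(\mathbb{Z}/n\mathbb{Z}) \cong \mathbb{Z} \oplus \bigoplus_{k \ge 1} \mathbb{Z}/n\mathbb{Z},
\qquad
\mathbb{Z}[x]/(nx) \cong \mathbb{Z} \oplus \bigoplus_{k \ge 1} (\mathbb{Z}/n\mathbb{Z})\, x^k,
% using that n x^k = x^{k-1} \cdot (nx) lies in (nx) for every k >= 1.
% So try sending the degree-1 generator of T(Z/nZ) to x and checking that
% this respects multiplication.
```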
The functions satisfy boundary conditions. At the boundaries I could show they're commutative, but what about inside the boundaries? I'm out of ideas.
A question is asking for an example of a ring R and a prime ideal p such that the localisation of R at p is a field. I said the integers with the zero ideal, is that correct?
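That example checks out, as far as I can tell; a one-line verification:

```latex
% Localizing Z at the prime ideal (0) inverts every nonzero integer:
\mathbb{Z}_{(0)} = \left\{ \tfrac{a}{b} : a, b \in \mathbb{Z},\ b \neq 0 \right\} = \mathbb{Q},
% which is a field. More generally, localizing any integral domain R at (0)
% gives its field of fractions.
```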
I am interested in working through some parts of R. Stanley's "Combinatorics and Commutative Algebra" (ISBN: 978-0817643690).
Currently I have taken 2 courses in abstract algebra (Group Theory, Ring Theory and Galois Theory), 2 courses in real analysis, a course in general combinatorics, a course in graph theory, linear algebra, the calculus sequence, intro to proofs, and other assorted math electives.
I have skimmed through some parts of the text and there are quite a number of terms that I am not familiar with. I'm not sure what a module, a chain complex, a simplicial complex, homology, or a resolution are, and they seem to appear frequently.
Are there any texts / materials online that would help prepare me to tackle this?
Sorry for the wall of text and thanks very much for any suggestions.
Regarding radicals, we have the rule:
sqrt(x^2) = abs(x)
For example:
sqrt((-3)^2) = sqrt(9) = 3
However, exponents are considered to be commutative:
(x^2)^3 = x^6
(x^3)^2 = x^6
therefore (x^2)^3 = (x^3)^2
Therefore, isn't the following situation possible:
sqrt(x^2) =
(x^2)^(1/2) = (x^(1/2))^2
Using our first example:
((-3)^(1/2))^2 =
(i·3^(1/2))^2 =
(-1)(3) = -3
Why bother with absolute value? Why not +/- like how quadratic equations are solved? So sqrt(9) = +3 or -3.
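One way to see where the paradox breaks (my annotation): the rule (x^a)^b = x^(ab) is only safe when x ≥ 0 or when both exponents are integers, and the step from sqrt(x^2) to (x^(1/2))^2 uses it with a non-integer exponent on a negative base.

```latex
% The two orders of operations genuinely disagree on a negative base:
\bigl((-3)^2\bigr)^{1/2} = 9^{1/2} = 3
\quad\neq\quad
\bigl((-3)^{1/2}\bigr)^2 = (i\sqrt{3})^2 = -3.
% As for the \pm question: sqrt denotes the principal (non-negative) root,
% a single-valued function, so sqrt(9) = 3. The \pm in the quadratic formula
% comes from solving x^2 = 9, i.e. x = \pm\sqrt{9}, not from sqrt itself.
```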
There are two MOOCs coming up on the Indian MOOC platform NPTEL: one on Linear Algebra and the other on Commutative Algebra, both fairly advanced, probably at the graduate level. The Linear Algebra one seems to be the first of a series.
https://onlinecourses.nptel.ac.in/explorer/search?category=MATH_SCIENCE