Wikipedia:Reference desk/Archives/Mathematics/2009 April 3

From Wikipedia, the free encyclopedia
Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.


April 3

Integral comparison test for a rather ugly function

Hi there - I'm getting a bit of practice with the integral comparison test, and having previously, by the integral test, investigated the convergence of ∑ 1/(n log n) and a related series, I'm now being asked to determine whether ∑ 1/(n^(1+1/n) log n) converges - but other than possibly pulling an x log(x) out of the bottom, I really can't see any way to proceed - nor can I see a comparison with another series which converges or diverges to give me my result. Could anyone give me a hint?

Thanks very much,

Spamalert101 (talk) 02:14, 3 April 2009 (UTC)[reply]

I think you are supposed to notice that n^(1/n) is neither small nor large, so can be ignored. More specifically, if n ≥ 3, then 1 < n^(1/n) < 3^(1/3) < 3/2, so you can compare: (2/3)·1/(n log n) < 1/(n^(1+1/n) log n) < 1/(n log n). JackSchmidt (talk) 03:18, 3 April 2009 (UTC)[reply]
Compare it to the first series you mentioned. — DanielLC 04:05, 3 April 2009 (UTC)[reply]
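JackSchmidt's sandwich bounds are easy to sanity-check numerically. This Python sketch is an editorial addition (not part of the thread); it assumes the series term 1/(n^(1+1/n) log n) as written above:

```python
import math

# Spot-check the bounds for a spread of n >= 3:
#   (2/3) * 1/(n log n)  <  1/(n^(1+1/n) log n)  <  1/(n log n)
for n in range(3, 10**6, 9999):
    term = 1.0 / (n ** (1.0 + 1.0 / n) * math.log(n))
    lower = (2.0 / 3.0) / (n * math.log(n))
    upper = 1.0 / (n * math.log(n))
    assert lower < term < upper
print("bounds hold")
```

Since ∑ 1/(n log n) diverges (by the integral test), the lower bound forces the middle series to diverge as well.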

If you've already concluded that ∑ 1/(n log n) diverges, then you could look at the limit of the ratio of the terms of that series to those of ∑ 1/(n^(1+1/n) log n). If the limit of that ratio is a finite strictly positive number, then either both series converge or both diverge.
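The limit-comparison suggestion can be illustrated numerically: the ratio of corresponding terms simplifies algebraically to n^(-1/n), which tends to 1. A short Python sketch (an editorial addition, not part of the thread):

```python
import math

def ratio(n):
    """Ratio of the n-th terms: [1/(n^(1+1/n) log n)] / [1/(n log n)]."""
    a = 1.0 / (n ** (1.0 + 1.0 / n) * math.log(n))
    b = 1.0 / (n * math.log(n))
    return a / b  # simplifies algebraically to n**(-1.0/n)

# The ratio tends to 1 (a finite, strictly positive limit), so the
# two series converge or diverge together.
for n in (10, 10**3, 10**6):
    print(n, ratio(n))
```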

Also, notice that in TeX one should write "n \log n", with a backslash in front of "log". That not only (1) prevents italicization of "log", and (2) results in proper spacing before and after "log", but also (3) when TeX is used in the usual way, as opposed to the way it's used in Wikipedia, prevents line breaks between "log" and the succeeding "n". Michael Hardy (talk) 17:17, 3 April 2009 (UTC)[reply]

Actually, TeX can only break a formula at the outermost bracing level after a relation symbol, a binary operation symbol, an explicit penalty, or a discretionary (from \- or \discretionary). There is no admissible break-point between two ordinary symbols, such as in "n log n" typed without the backslash. You are of course right that one should write "n \log n" in this case, but not because of line breaks. — Emil J. 12:42, 6 April 2009 (UTC)[reply]
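For illustration (an editorial addition, not part of Emil's reply), the two spellings side by side in a LaTeX source:

```latex
% Without the backslash, "log" is parsed as three ordinary symbols l, o, g,
% typeset in italics with no operator spacing:
$n log n$
% With \log, TeX treats it as an operator name: upright roman, with a thin
% space between it and the adjacent symbols:
$n \log n$
```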

mathematics-groups and rings

relations and type of relations —Preceding unsigned comment added by Lakshmibala (talkcontribs) 09:19, 3 April 2009 (UTC)[reply]

"If your question is homework, show that you have attempted an answer first, and we will try to help you past the stuck point. If you don't show an effort, you probably won't get help. The reference desk will not do your homework for you." yandman 12:05, 3 April 2009 (UTC)[reply]
This is hardly a specific question. See group, ring, and relation; some types of relations are listed in binary relation. — Emil J. 14:07, 3 April 2009 (UTC)[reply]

Really easy question

This is probably a really easy question, but for some reason I can't figure it out! It's a bit embarrassing but here goes; I'm studying directional derivatives in Calculus, and I have two problems- the first one is to find the rate of change of at the point . Now I have to measure the rate of change in the direction and in the direction . Now ironically I understand this completely, and the answers from the first and second question are and respectively. Now this is where I'm stuck, the book simplifies the first answer as and the second answer as . How does it reach that conclusion? Don't bother to point out how easy this is, I've been beating my head with the keyboard for not getting this --BiT (talk) 15:29, 3 April 2009 (UTC)[reply]

. —JAOTC 15:46, 3 April 2009 (UTC)[reply]
Aaah.. thank you. Please don't tell my teacher. --BiT (talk) 15:56, 3 April 2009 (UTC)[reply]

At some point before calculus you should have learned something called "rationalizing the denominator". That's what's being done here. Michael Hardy (talk) 17:20, 3 April 2009 (UTC)[reply]
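Rationalizing the denominator is just multiplying by a convenient form of 1. The thread's exact numbers are not reproduced above, so this sketch (an editorial addition) uses 1/√2 as a stand-in example:

```python
import math

# 1/sqrt(2) = sqrt(2)/2: multiply top and bottom by sqrt(2), so the surd
# moves to the numerator and the denominator becomes rational.
lhs = 1.0 / math.sqrt(2.0)
rhs = math.sqrt(2.0) / 2.0
print(lhs, rhs)
```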

I know how to do this, I just totally blanked out. Possibly because I'd been learning multi-variable calculus all night; it sometimes happens to me that the simple stuff gets in my way when I've been studying heavy stuff for a long time. I'm not normally this stupid. --BiT (talk) 17:48, 3 April 2009 (UTC)[reply]
This is just in case you like to know what goes on in my mind. A second after I saw it, a voice in my head said "Multiply top and bottom by the square root." To me it's the 2-axis differential calculus that would be the difficult part. Cuddlyable3 (talk) 18:19, 3 April 2009 (UTC)[reply]
Directional derivatives are pretty easy, just a simple multiplication and square root plus some dot product. I know I should have thought "multiply top and bottom by the root", and honestly I don't know why I didn't. It seems so obvious. --BiT (talk) 19:08, 3 April 2009 (UTC)[reply]
Don't worry about it. When I studied double integration in polar co-ordinates, I often used to forget the "r * dr * dθ" and instead only write "dr * dθ". It is really annoying to find out you have done that after calculating the integral and checking the answer, but it happens. Calculus is really a somewhat mechanical subject. Often there is no intuition in the calculation (you can't guess the answer before calculating it out) and there can be many simple mistakes. But as you will see, analysis is not like this - it is more generalized compared to calculus and you don't do much "calculating" in many parts of analysis. Nevertheless, unless you study calculus, it is difficult to understand analysis, but this depends of course on what you call "analysis". Many textbooks claim to be a "textbook of analysis" but are really a "textbook on calculus". --PST 03:36, 5 April 2009 (UTC)[reply]
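The r dr dθ slip above can be demonstrated numerically. This Python sketch (an editorial addition) computes the area of the unit disk with and without the Jacobian factor r:

```python
import math

def unit_disk_area(include_jacobian, steps=100000):
    """Midpoint rule for the double integral over r in [0,1], theta in [0, 2*pi].

    With the Jacobian included the integrand is r dr dtheta (correct);
    without it the integrand is just dr dtheta (the slip described above).
    """
    dr = 1.0 / steps
    total = 0.0
    for i in range(steps):
        r = (i + 0.5) * dr
        total += (r if include_jacobian else 1.0) * dr
    return 2.0 * math.pi * total  # theta integral contributes a factor of 2*pi

print(unit_disk_area(True))   # close to pi, the actual area
print(unit_disk_area(False))  # close to 2*pi, the wrong answer
```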

Fundamental Theorem of Algebra

Hi, the article fundamental theorem of algebra says that there are no purely algebraic proofs of the theorem yet, and some believe there can be none. And all the proofs given are very complicated, too.

Doesn't it suffice to prove that x^n + a_(n-1)x^(n-1) + ... + a_0 may be written as (x - x_1)(x - x_2)...(x - x_n) for all complex a_i and natural n, and for some complex x_i?

This can be shown fairly easily by induction, so there must be something I'm missing that makes the proof difficult. 79.72.167.169 (talk) 20:37, 3 April 2009 (UTC)[reply]

Yes, that statement is exactly the fundamental theorem of algebra (or almost exactly; authors differ on exactly what the FToA says). However, it cannot be easily proven by induction. What can be easily proven by induction is that, given that every non-constant complex polynomial has a root, your statement holds. If you have an easy proof that every non-constant complex polynomial has a root, I'd like to see it. Algebraist 20:41, 3 April 2009 (UTC)[reply]
I think I'm still missing something -- I don't quite get how that condition changes anything. For the induction I imagined multiplying both sides of the identity by (x - x_(n+1)); the factored side is then what's needed, and the power series's coefficients just need to be shown to be arbitrary. By doing this, isn't your condition also proven at the same time? I can't see the place where it's being assumed and relied on in the inductive step, and the base case would just be the linear function, which obviously has a root and trivially satisfies my condition. 79.72.167.169 (talk) 22:52, 3 April 2009 (UTC)[reply]
Your induction makes no sense that I can see. Suppose I give you a degree n polynomial, and I also give you the fact that every degree n-1 polynomial can be factored as stated. How are you going to even start factoring the degree n polynomial? That's what you need to do to prove the induction step. Another way of reality-checking your proof is to work out at what point you use the fact that you're working over the complex numbers. It's not obvious to me that you're using that fact at all, but you certainly need to, since the theorem doesn't hold over arbitrary fields. Algebraist 23:01, 3 April 2009 (UTC)[reply]
The complex-numbers-versus-arbitrary-fields point is a good one; I can't see anywhere I'm specifically using any properties of the complex numbers that don't apply to general fields, but I'm not very familiar with abstract structures (the limit of my formal maths education is to A-level).
Here's my inductive step in full:
Given that: x^n + a_(n-1)x^(n-1) + ... + a_0 = (x - x_1)(x - x_2)...(x - x_n) (for all complex a_i) (call this equation 1), the inductive step is to show that: x^(n+1) + b_n x^n + ... + b_0 = (x - x_1)(x - x_2)...(x - x_(n+1)) (for all complex b_i) (call this equation 2).
Multiplying equation 1 by (x - x_(n+1)) gives: (x - x_(n+1))(x^n + a_(n-1)x^(n-1) + ... + a_0) = (x - x_1)(x - x_2)...(x - x_(n+1)) (call this equation 3),
It is given that the a_i may be any complex numbers. The coefficients of the x terms are all a constant difference away from these, so these coefficients are also any complex numbers. As x_(n+1) is also some arbitrary complex number, by a suitable selection of its value, the constant term is also any complex number. Hence, all the coefficients in eqn. 3 are entirely arbitrary complex numbers. Let these coefficients be b_0, ..., b_n. Then eqn 3 is identical to eqn 2, which is what I was trying to prove, and thus if the theorem is true for n, it must be true for n+1 too.
This might be a glitch in my brain. It's pretty late here, and I'll take a fresh look in the morning. 79.72.167.169 (talk) 23:37, 3 April 2009 (UTC)[reply]
What is x_(n+1)? You are implicitly assuming that the polynomial has a root, which is the condition you need to prove. --Tango (talk) 00:30, 4 April 2009 (UTC)[reply]
For starters, your multiplication of polynomials is wrong. The LHS should be x^(n+1) + (a_(n-1) - x_(n+1))x^n + (a_(n-2) - a_(n-1)x_(n+1))x^(n-1) + ... + (a_0 - a_1 x_(n+1))x - a_0 x_(n+1). With the correct expression, it is not at all clear that you can choose the a_i and x_(n+1) so as to make the coefficients come out right whatever the b_i might be. Algebraist 06:33, 4 April 2009 (UTC)[reply]
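The corrected expansion can be checked mechanically. This Python sketch is an editorial addition; poly_mul and the degree-2 coefficients are made up for illustration. It shows that multiplying by (x - x_(n+1)) mixes each a_i with x_(n+1) multiplicatively, so the new coefficients are not merely "a constant difference away" from the old ones:

```python
def poly_mul(p, q):
    """Multiply two polynomials given as coefficient lists, lowest degree first."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, pi in enumerate(p):
        for j, qj in enumerate(q):
            out[i + j] += pi * qj
    return out

a0, a1, c = 5.0, 7.0, 2.0         # c plays the role of x_(n+1)
p = [a0, a1, 1.0]                 # x^2 + a1*x + a0
product = poly_mul(p, [-c, 1.0])  # multiply by (x - c)
# Coefficients come out as [-a0*c, a0 - a1*c, a1 - c, 1]:
print(product)
```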
As to the point of existence of purely algebraic proofs of the fundamental theorem of algebra, I wonder if "purely algebraic" has a precise meaning. I suppose that everybody agrees that being algebraically closed is a purely algebraic property. But the question is: is the complex field a purely algebraic object? If your answer is yes, then I think that all proofs can be translated into a language that is purely algebraic to you. On the other hand, if your answer is no, I do not see how a proof of this theorem that you may consider purely algebraic could exist, because the fact that we are talking about the complex field has to enter somewhere in the proof. --pma (talk) 21:42, 3 April 2009 (UTC)[reply]
If 'purely algebraic' had a precise meaning, I expect that question would've been resolved. I too am not sure what a purely algebraic proof would even look like. The most obvious purely algebraic definition of the complex numbers is as the unique-up-to-isomorphism characteristic-zero algebraically-closed field of continuum cardinality, but that doesn't get us very far. Algebraist 21:46, 3 April 2009 (UTC)[reply]
Very interesting... Do you mean that any algebraically closed field with the cardinality of the continuum is isomorphic to ℂ? So the algebraic closure of any characteristic-zero field of continuum cardinality is again isomorphic to ℂ? (PS: above I was referring to the definition of ℂ used in the FTA; doesn't matter anyway).--pma (talk) 22:40, 3 April 2009 (UTC)[reply]
Sure, this is fairly easy. Any two ACFs of the same characteristic and transcendence degree are isomorphic, and transcendence degree is the same as cardinality for uncountable fields. Algebraist 22:34, 3 April 2009 (UTC)[reply]
Oh I see... Thank you! --pma (talk) 22:40, 3 April 2009 (UTC)[reply]
If you define the complex numbers in terms of the real numbers, that's analytic - I don't know an algebraic definition of the real numbers (although that doesn't mean such a definition doesn't exist). --Tango (talk) 00:32, 4 April 2009 (UTC)[reply]
Well, you can think of the real numbers "algebraically" as a Real closed field containing the field of rational numbers with continuum many transcendentals adjoined. The fundamental theorem of algebra then follows from the real closed property, definition number 3, as in the algebraic proof in the article, but this isn't much more than Algebraist's "doesn't get us very far" above. John Z (talk) 04:37, 4 April 2009 (UTC)[reply]
I'm not sure if you've made this mistake or not, but just in case: it's true that the reals are a transcendence-degree-continuum RCF, but that doesn't work as a definition in the way it does for the complexes. The theory of RCFs is richer than that of ACFs (o-minimal rather than strongly minimal) and this leads to more models: there are 2^continuum non-isomorphic cardinality-continuum RCFs. Algebraist 06:25, 4 April 2009 (UTC)[reply]
Not sure if I was making that mistake when I wrote that late last night either. :-) But thanks for the explanation. Hard to imagine a more algebraic proof than the RCF one. What would a "purely algebraic" proof that positive reals have square roots be? John Z (talk) 22:22, 4 April 2009 (UTC)[reply]

Remmert's interpretation of "purely algebraic" is sort of explained on pages 109-110 of the book Numbers by Heinz-Dieter Ebbinghaus & John H. Ewing, which can be read at books.google.com (but maybe you need to log in). I can't copy/paste from there but basically he means algebraic as opposed to analytic. He writes "Many mathematicians believe there can be no purely algebraic proof, because the field R, and consequently its extension field C, is a construct belonging to analysis." He considers the existence of square roots of complex numbers to be an analytic fact, for example. Hmmm. McKay (talk) 00:55, 4 April 2009 (UTC)[reply]