Wikipedia:Reference desk/Archives/Mathematics/2012 October 27

Mathematics desk
Welcome to the Wikipedia Mathematics Reference Desk Archives
The page you are currently viewing is an archive page. While you can leave answers for any questions shown below, please ask new questions on one of the current reference desk pages.



October 27

Pi or Tau

What's better: pi (circumference/diameter) or tau (circumference/radius)? 203.112.82.2 (talk) 00:27, 27 October 2012 (UTC)[reply]

Personally, I like pi's positions on the economy, but tau's stance on defense is better. In all seriousness though... "better" in terms of what? You'll have to be significantly less vague. --Kinu t/c 01:49, 27 October 2012 (UTC)[reply]

The discussion is between [1] and [2]. Personally I think that tau is a much better choice. Bo Jacoby (talk) 03:41, 27 October 2012 (UTC).[reply]

Or from a different perspective: had academic inertia been on the other side (i.e. if τ had been the one to become mainstream instead of π), it is difficult to imagine a movement suggesting that π = τ/2 should replace τ occurring. — Quondum 05:46, 27 October 2012 (UTC)[reply]

Both of them are rather good for making puns (provided you pronounce τ using the ancient Greek pronunciation [taw], and not the modern [taf], of course). Double sharp (talk) 07:59, 27 October 2012 (UTC)[reply]

I like pi/3 because it's close to 1. Count Iblis (talk) 16:49, 27 October 2012 (UTC)[reply]

Tau is better.
In response to the pi manifesto:
  • The area of a sector is $\tfrac{1}{2}\theta r^2$, which itself follows from the area of a triangle being $\tfrac{1}{2}bh$. As such the natural way to express the area of a disc is $\tfrac{1}{2}\tau r^2$ (as it is a sector with angle tau) rather than $\pi r^2$; this is spelled out in symbols after this comment.
  • The standard normal distribution should of course be the one with a variance of 1, $\frac{1}{\sqrt{2\pi}}e^{-x^2/2}$. The half in the exponent is there to stay; and then you may as well group the 2 with the pi and write the normalizing constant as $\sqrt{\tau}$.
Of course, I wouldn't actually use tau after years of familiarity with both the number pi and the letter used to represent it. -- Meni Rosenfeld (talk) 20:01, 27 October 2012 (UTC)[reply]
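In symbols, the sector argument in the first bullet above runs as follows (a standard derivation written out for convenience, not a quotation from either manifesto):

% A sector of radius r and central angle \theta is a union of thin triangles
% of base r\,d\phi and height r, so the triangle formula (1/2)bh gives
A(\theta) = \int_0^{\theta} \tfrac{1}{2} r^2 \, d\phi = \tfrac{1}{2}\theta r^2 .
% The full disc is the sector with \theta = \tau = 2\pi, hence
A(\tau) = \tfrac{1}{2}\tau r^2 = \pi r^2 .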
How familiar with the number pi do you claim to be, Meni? Her full panoply of digital secrets has never yet been revealed, although many so-called mathematicians amuse themselves with the idea that computing a zillion quadrillion digits is somehow getting them closer to the Whole Truth, or even at least showing them some interesting scenery along the Road to Nowhere. It doesn't even do that, unfortunately. She knows all there is to know about playing hard to get. -- Jack of Oz [Talk] 20:27, 27 October 2012 (UTC)[reply]
The number pi is a very different thing from the decimal digits of pi. The decimal digits (even if you have all of them as a completed infinite totality) are a fairly inessential surface representation of the underlying Platonic reality. --Trovatore (talk) 22:44, 27 October 2012 (UTC)[reply]
Well I did memorize 90 decimal places 15 years ago (and yes, I still remember them). It's a bond that can never be broken. -- Meni Rosenfeld (talk) 20:40, 27 October 2012 (UTC)[reply]
I for one don't see why using tau would be that much better than using pi. Either would occur together with small rational factors in formulas just as frequently. If you like appeals to authority, let me note that Ken Iverson, who is famous for innovations in mathematical notation, appears to have preferred "pi times" over "tau times". – b_jonas 11:33, 28 October 2012 (UTC)[reply]
I must link to xkcd comic strip 576: Urgent Mission. – b_jonas 11:54, 28 October 2012 (UTC)[reply]

Identifying Rings

I am asked to identify the ring . What does this mean? Does it simply mean "the integers adjoined with ", for example? Or is it perhaps isomorphic to a more well-known ring (for example, the complex numbers are isomorphic to $\mathbb{R}[x]/(x^2+1)$) (but I don't think it's isomorphic to the algebraic numbers). --AnalysisAlgebra (talk) 10:40, 27 October 2012 (UTC)[reply]

It doesn't seem like a very well posed question, but I guess they just mean find some simpler looking, or more familiar ring that this one is isomorphic to. It turns out these relations kill quite a lot of stuff and you end up with something pretty small. Rckrone (talk) 20:17, 27 October 2012 (UTC)[reply]
Oh well. Thank you for your help anyway. — Preceding unsigned comment added by AnalysisAlgebra (talk • contribs) 00:56, 28 October 2012 (UTC)[reply]
I presume the notation implies that the relations and hold simultaneously. Adding these, this would imply that . I'm guessing that isomorphic to the Gaussian integers. The relation would then restrict these to a very small finite ring, as Rckrone suggests. — Quondum 07:33, 28 October 2012 (UTC)[reply]
Where did you get ? Are you sure you don't mean ?
is isomorphic to the smallest ring containing and elements satisfying . So is isomorphic to the smallest ring containing , , and , which is basically all possible sums and products of those elements, and is .--AnalysisAlgebra (talk) 09:44, 28 October 2012 (UTC)[reply]
Oops, yes: . (I understand that you get to ℂ from ℝ, but you're starting from ℤ in this instance.) I'd follow the equivalences, but I'm on unsure ground here. I seem to end up with a rather small ring. — Quondum 11:10, 28 October 2012 (UTC)[reply]

Functions

Spivak states that "in almost every branch of mathematics, functions turn out to be the central objects of investigation", or something similar. To what extent is this true? What are the most important exceptions? (License for personal opinion is clear, I hope.) —Anonymous DissidentTalk 14:00, 27 October 2012 (UTC)[reply]

The number of branches of mathematics is not constant. Nor is it well defined which objects of investigation are "central". The statement is subjective and imprecise, and the extent of its truth cannot be established. The obvious exceptions are of course the branches that predate the concept of "function". Bo Jacoby (talk) 19:18, 27 October 2012 (UTC).[reply]
The statement would ring truer IMO if it read "in almost every branch of mathematics, functions turn out to be central to the investigation of objects". —Quondum 11:21, 28 October 2012 (UTC)[reply]
Yes, this sounds even more true. --pma 18:55, 28 October 2012 (UTC)[reply]

World series

In the baseball World Series, the winner of 4 of the 7 games wins the series. What are the chances, given that team A has won the first 3, that team B will win the last 4 games, and thus the series ?

Obviously, you can go off the historical record, but it only goes back a century or so, and changes in the rules might make old World Series less useful for statistics. Incidentally, I believe this 3-then-4 pattern has happened only once, in 2004.

But I'm more interested in the Bayesian probability approach. Is this a reasonable way to approach the problem ?

A) Assume each game to be an independent event. That is, whatever team A's probability of winning the first game, this is the same probability they will win each of the remaining games. So, no "momentum", no effect from where the games are played, etc.

B) For the first trial, assume a 1/2 probability of winning. Thus, we get a 1/128 probability for team A winning the first 3 and team B winning the last 4 and the series. (If we allow for the possibility of reversing who wins the 3 games and who wins 4, this doubles the probability to 1/64, but I'm not concerned with that.)

C) However, if we assume that the team which will win 4 games is better, say winning 2/3 of their games, we then get (1/3)^3 (2/3)^4 or 16/2187 or 1/136.6875. Looks like 2/3 is going too far.

D) Therefore, should we assume the winning team has a 4/7 chance of winning, since that will be their record in the World Series, if this situation occurs ? This gives a probability of (3/7)^3 (4/7)^4 or 6912/823543 or about 1/119.146846.

So, is that the best answer ? StuRat (talk) 22:58, 27 October 2012 (UTC)[reply]
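A quick check of the arithmetic in (B)-(D), under the stated assumption of independent games with a fixed win probability (a sketch in Python, not part of the original thread; the helper name seq_prob is made up for illustration):

from fractions import Fraction

def seq_prob(p_a, a_wins, b_wins):
    """Probability that A wins the first a_wins games and B wins the next
    b_wins games, when A wins each game independently with probability p_a."""
    return p_a**a_wins * (1 - p_a)**b_wins

print(seq_prob(Fraction(1, 2), 3, 4))  # (B): 1/128
print(seq_prob(Fraction(1, 3), 3, 4))  # (C): 16/2187, about 1/136.7
print(seq_prob(Fraction(3, 7), 3, 4))  # (D): 6912/823543, about 1/119.15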

No. To do it in a Bayesian way, you don't just need to assume a prior probability for A winning, you need to assume a probability distribution for the probability of A winning. Once you have that, you can update it based on observing that A has won 3 times.
You can arrive at a suitable distribution by looking at all statistics of the game, not just for A and B.
A popular choice for a prior distribution for the probability of an event is the beta distribution, because it is a conjugate prior for this problem - that is, once you collect data, the updated distribution is again beta. The highest-entropy choice is $\alpha=\beta=1$, which gives a uniform distribution; but that is not likely to be adequate for baseball.
Don't forget that you asked for the probability of B winning the last 4 games given that A won the first 3. This means you don't multiply by the probability of A winning the first 3. -- Meni Rosenfeld (talk) 06:29, 28 October 2012 (UTC)[reply]
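As a concrete sketch of the conjugate update just described (in Python; the prior parameters below are illustrative placeholders, not values fitted to baseball statistics):

# Beta(alpha, beta) prior for p = P(team A wins a single game).
# Observing w wins and l losses by A updates it to Beta(alpha + w, beta + l).
def update_beta(alpha, beta, wins, losses):
    return alpha + wins, beta + losses

alpha0, beta0 = 1, 1                      # uniform prior, the maximum-entropy choice
alpha1, beta1 = update_beta(alpha0, beta0, wins=3, losses=0)
print(alpha1, beta1)                      # 4 1, i.e. Beta(4, 1) after A wins 3 straight
print(alpha1 / (alpha1 + beta1))          # posterior mean of p: 0.8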
Oops, I meant the total probability. So, how does this work out ? StuRat (talk) 19:22, 29 October 2012 (UTC)[reply]

Prior to playing 7 games there are 8 equally probable outcomes: team A wins K times and loses 7-K times, where 0≤K≤7. After 3 games team A won k=3 times and lost 3-k=0 times. This can happen in $\binom{3}{k}=\binom{3}{3}=1$ ways. The requested Bayesian probability is $\frac{P(K=3)/\binom{7}{3}}{\sum_{K=3}^{7}P(K)\,\binom{4}{K-3}/\binom{7}{K}}=\frac{(1/8)/35}{1/4}=\frac{1}{70}$. Bo Jacoby (talk) 09:50, 30 October 2012 (UTC).[reply]
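A brute-force check of this count (uniform prior over K, every ordering of a given K equally likely), added as an independent sketch in Python rather than taken from the original post:

from fractions import Fraction
from itertools import combinations
from math import comb

p_first3_A = Fraction(0)           # P(A wins games 1-3)
p_AAABBBB = Fraction(0)            # P(A wins games 1-3 and B wins games 4-7)
for K in range(8):                 # K = A's total wins, uniform on 0..7
    for wins in combinations(range(7), K):   # positions of A's wins
        p_seq = Fraction(1, 8) / comb(7, K)  # each ordering equally likely given K
        if set(range(3)) <= set(wins):
            p_first3_A += p_seq
            if set(wins) == set(range(3)):
                p_AAABBBB += p_seq

print(p_AAABBBB / p_first3_A)      # 1/70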

[ec]I think the only interpretation of this that makes sense is, "given that A has won 3 times, and that A and B go on to play 7 more games, what is the probability of A winning the first 3 of these 7 games and B winning the last 4?". Then the calculation is as follows:
  • Start with a prior distribution for p, the probability of A winning, typically Beta with parameters $\alpha$ and $\beta$. (It is assumed that p itself is fixed; the teams don't get any better or worse throughout the competition).
  • After observing 3 wins by A, update your distribution to Beta with parameters $\alpha+3$ and $\beta$.
  • Given p, the probability for A winning the first 3 of the next 7 matches and B winning the last 4 is $p^3(1-p)^4$.
  • So given your distribution for p, which is $\mathrm{Beta}(\alpha+3,\beta)$ with density $\frac{p^{\alpha+2}(1-p)^{\beta-1}}{B(\alpha+3,\beta)}$, the probability of this happening is $\int_0^1 p^3(1-p)^4\,\frac{p^{\alpha+2}(1-p)^{\beta-1}}{B(\alpha+3,\beta)}\,dp=\frac{B(\alpha+6,\beta+4)}{B(\alpha+3,\beta)}$.
  • For a uniform prior ($\alpha=\beta=1$) this gives $\frac{B(7,5)}{B(4,1)}=\frac{2}{1155}\approx 0.0017$ (checked numerically below). For a somewhat more realistic prior this gives . -- Meni Rosenfeld (talk) 09:56, 30 October 2012 (UTC)[reply]
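The Beta-function ratio above can be evaluated exactly for integer parameters. The following Python sketch (not part of the original reply) reproduces the uniform-prior figure and, for comparison, the conditional probability of B winning the next 4 games that comes up later in the thread:

from fractions import Fraction
from math import factorial

def beta_fn(a, b):
    # Beta function at positive integers: B(a, b) = (a-1)! (b-1)! / (a+b-1)!
    return Fraction(factorial(a - 1) * factorial(b - 1), factorial(a + b - 1))

a1, b1 = 1 + 3, 1                  # posterior Beta(4, 1): uniform prior plus 3 A wins

# P(A wins the first 3 of the next 7 games and B wins the last 4):
print(beta_fn(a1 + 3, b1 + 4) / beta_fn(a1, b1))   # 2/1155

# P(B wins the next 4 games), the conditional probability in the original question:
print(beta_fn(a1, b1 + 4) / beta_fn(a1, b1))       # 1/70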
Bo, that's not how probability works; and you basically lost all credibility by saying that all 8 outcomes start out equally probable. (When tossing a coin 100 times, is 50 heads as probable as no heads?) -- Meni Rosenfeld (talk) 09:59, 30 October 2012 (UTC)[reply]

Sorry Meni, but you are mistaken. Tossing coins is not the same thing as playing baseball. When you toss coins you know from lots of earlier experience or from symmetry that the odds are 1 to 1, and further experience provides little further information. In this case of playing baseball you have no prior information about the relative strengths of team A and team B and so it is not less probable that K=0 than that K=3. See for example Laplace#Inductive_probability. Bo Jacoby (talk) 17:33, 30 October 2012 (UTC).[reply]

But you do know from prior experience that the distribution of winning probabilities is not uniform. You're not sure that A has a 1/2 probability of beating B, but you know the probability will likely be close to 1/2, so K=3 is more probable than K=0.
Hypothetically, you could know from prior experience that the distribution is even more variable than uniform, but I doubt this would be the case for baseball. -- Meni Rosenfeld (talk) 19:25, 30 October 2012 (UTC)[reply]

You don't need winning probabilities and beta distributions and integration and all that jazz. All you need to know is the prior probability distribution for K. The problem description gives no reason to deviate from the principle of insufficient reason. If you have a better prior then you can use it right away. It doesn't change the final result much. Bo Jacoby (talk) 00:03, 31 October 2012 (UTC).[reply]

The problem description tells us we are talking about baseball, that's reason to deviate from nameless abstract entities into what we know about baseball.
In order to derive the distribution of K you need to integrate over the distribution of p anyway, and it just obfuscates things. -- Meni Rosenfeld (talk) 06:21, 31 October 2012 (UTC)[reply]

What, then, is the prior distribution of K ? Bo Jacoby (talk) 07:35, 31 October 2012 (UTC).[reply]

To determine it you need, as I said, to look at baseball statistics. For the special case of a uniform prior for p, $\alpha=\beta=1$, the prior of K is also uniform.
By the way, to determine these integrals you can use the general rule that $\int_0^1 p^a(1-p)^b\,dp = \frac{a!\,b!}{(a+b+1)!}$. -- Meni Rosenfeld (talk) 09:04, 31 October 2012 (UTC)[reply]
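A small check of this rule, and of the claim that a uniform prior on p makes K uniform, using exact arithmetic in Python (a sketch, not from the original post):

from fractions import Fraction
from math import comb, factorial

def integral(a, b):
    # int_0^1 p^a (1-p)^b dp = a! b! / (a+b+1)!  for non-negative integers a, b
    return Fraction(factorial(a) * factorial(b), factorial(a + b + 1))

# With a uniform prior on p: P(K = k) = C(7, k) * integral(k, 7 - k) = 1/8 for every k.
print([comb(7, k) * integral(k, 7 - k) for k in range(8)])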

We agree that K is uniformly distributed prior to playing, and prior to taking unknown previous baseball statistics into account. The problem is strictly combinatorial. The continuous variable p does not need to contaminate the solution. Bo Jacoby (talk) 09:34, 31 October 2012 (UTC).[reply]

It's turtles all the way down. Even before you look at baseball statistics you know that baseball is a game and winning probabilities in games are typically not distributed uniformly. You can't take an idealized combinatorial solution which has nothing to do with the problem at hand and pretend you're doing Bayesian statistics.
Even if we did agree on a uniform prior, the combinatorial approach merely obfuscates what is really going on. -- Meni Rosenfeld (talk) 11:13, 31 October 2012 (UTC)[reply]

I suppose we now agree that the answer is 1/70? The 8 possible numbers of wins in 7 unplayed games have equal prior probabilities unless there is compelling evidence against it, which there is not, even if we know that the games are baseball games. Bo Jacoby (talk) 13:37, 31 October 2012 (UTC).[reply]

We have always agreed that if the prior of p is uniform then, given that A has won 3 games, the probability that B will win the next 4 games is 1/70. And we have always disagreed about whether the prior of p is uniform. (The number 1/70 isn't in my results above since StuRat said he wants the total probability, rather than the conditional probability, and I did my best to interpret that.) -- Meni Rosenfeld (talk) 14:13, 31 October 2012 (UTC)[reply]

Thank you for restoring my credibility :) Bo Jacoby (talk) 08:01, 1 November 2012 (UTC).[reply]