Reading over that, I realize I actually miss the older HN, where the conversations seemed fairly long and in-depth. That's still often the case, but the average length and depth do seem to be decreasing.
I completely agree. There are a few contributing factors, none of which are unsolvable. But it seems like the way to solve it is to essentially fork HN.
The current plan is to create a "mirror" of HN's front page, but with a smaller community. You'll be able to keep your HN name, because nobody else will be allowed to register an existing HN name on the new site - only its owner can claim it.
Then, we'll need content. This was the secret sauce in the early HN days. The sole reason for people to come to the new site would be good conversation, and I think recruiting a few of the writers from HN would be enough to bootstrap it. Part of HN's early success was that pg actually participated in HN. Whereas the current mod team just shows up to beat us over the head and little else.
If you want to help plan this out, contact me (email's in profile). I think we have a shot at this. Some past thoughts on the topic:
> Jonathan Pace is a 51-year old Electrical Engineer living in Germantown, Tennessee. Perseverance has finally paid off for Jon - he has been hunting for big primes with GIMPS for over 14 years.
Bitcoin doesn't pay out cash when you find a block. It pays out Bitcoins. So the value of that is based on how much other people will pay for a bitcoin.
So I didn't know this, but got curious about how many known primes there are. I knew there were infinitely many primes, but I thought there would be some concrete list somewhere of all the primes we had discovered - apparently not.
> Nobody's really keeping count. ... There are very many hundred-digit primes to find. We could cover the Earth in harddisks full of distinct hundred-digit primes to a height of hundreds of meters, without even making a dent in the supply of hundred-digit primes.
I also used to think that there weren't very many primes. But the prime number theorem says that the number of primes less than n is about n/log(n). The function log grows very slowly, so a sizable proportion of numbers are prime. For example, the number of primes less than 10^100 is about 4*10^97.
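The n/log(n) estimate is easy to compare against an exact count for small n; a quick Python sketch:

```python
import math

def prime_count(n):
    """Exact count of primes <= n via a sieve of Eratosthenes."""
    if n < 2:
        return 0
    sieve = bytearray([1]) * (n + 1)
    sieve[0] = sieve[1] = 0
    for i in range(2, int(n**0.5) + 1):
        if sieve[i]:
            # mark all multiples of i (starting at i*i) as composite
            sieve[i*i::i] = bytearray(len(sieve[i*i::i]))
    return sum(sieve)

for n in (10**4, 10**6):
    print(n, prime_count(n), round(n / math.log(n)))
```

The simple n/log(n) form undercounts somewhat; the logarithmic integral li(n) is a tighter estimate, but the order of magnitude is already right.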
> So I didn't know this, but got curious about how many known primes there are
This isn't the exact question you're asking, but we actually know the distribution of prime numbers, which allows us to calculate the (approximate) number of primes that are less than or equal to an arbitrary value.
Since the largest prime discovered is 2^(77,232,917-1), that means that the number of primes less than or equal to it is approximately 2^(77,232,917-1)/ln(2^(77,232,917-1)).
That's approximately equal to:
2^(77,232,917-1)/ln(2^(77,232,917)), which is in turn equal to 2^(77,232,917-1)/(77,232,917 * ln(2)).
That's a number that's too big to plug into your standard everyday calculator, but that tells you the number of primes you could "discover" and still not break the (new) record.
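Reading the record prime as 2^77,232,917 - 1 (per the announcement), the magnitude of that estimate can still be worked out with logarithms even though the number itself won't fit in a calculator; a sketch in Python:

```python
import math

p = 77_232_917
# log10 of the PNT estimate 2^p / (p * ln 2) for the number of primes <= 2^p - 1
log10_count = p * math.log10(2) - math.log10(p * math.log(2))
digits = int(log10_count) + 1
print(f"the estimated prime count is a number with about {digits:,} digits")
```

That count is only about eight digits shorter than the record prime itself (23,249,425 digits): dividing by ln(2^p) barely shrinks a number this large.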
Not sure if this was a serious comment, but if it was: due to operator precedence in mathematics, the expressions (2^77,232,917)-1, 2^(77,232,917)-1, and 2^77,232,917-1 are evaluated identically, with the parentheses only there to aid the human eye. This is in contrast to 2^(77,232,917-1), which is 2^77,232,916 - a very different number indeed.
> but thought that there would be some concrete list of all the primes that we had discovered
It is trivial (in computational power) to find a random prime on the order of 2^2048, so such a list would be far too big to store anywhere and would always be out of date. It is not hard to find very big primes; it is just hard to find very, very big primes.
RSA encryption (https://en.wikipedia.org/wiki/RSA_(cryptosystem), used in PGP and TLS certificates, among others) relies on my knowing a pair of primes you don't know - so even if somebody tried to keep such a list, it had better be incomplete.
Not really - it is very hard to test an arbitrary very large number for primality. This project only tests numbers that satisfy the Mersenne prime definition, though. 9 of the 10 largest known primes are Mersenne primes, probably because more people are testing numbers of that form.
People test these because it's the easiest pattern we know of for constructing large primes. So if you want to maximize your chances of finding a record-sized prime, you'll check numbers of the Mersenne form.
People test these because we have the technology to check whether a large number of the Mersenne form is prime much faster than we can check numbers that do not conform.
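Concretely, the speed-up for Mersenne candidates comes from the Lucas-Lehmer test, the deterministic test GIMPS runs; a minimal Python sketch (fine for small exponents - the real project uses FFT-based multiplication for numbers this large):

```python
def lucas_lehmer(p):
    """Lucas-Lehmer test: for an odd prime p, 2^p - 1 is prime
    iff s == 0 after p - 2 iterations of s -> s^2 - 2 (mod 2^p - 1),
    starting from s = 4."""
    m = (1 << p) - 1
    s = 4
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# exponents of the first few Mersenne primes beyond p = 2
print([p for p in (3, 5, 7, 11, 13, 17, 19) if lucas_lehmer(p)])
# → [3, 5, 7, 13, 17, 19]  (p = 11 gives the composite 2047)
```

No comparably fast deterministic test is known for numbers of arbitrary form, which is why the largest known primes are nearly all Mersenne.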
That list would be as long as the list of "all even numbers", because any finite list of primes immediately reveals a prime that should be on the list but isn't, in the same way that a list of even numbers immediately gives you the next even number that should be on it but isn't:
Given an ordered list of primes {2, 3, 5, ..., n}, consider the number (2 x 3 x 5 x ... x n) + 1. Dividing it by 2, or 3, or 5, or ..., or n always leaves remainder 1, so none of the listed primes divide it. It is therefore either prime itself, or divisible by some prime not on the list - either way, the list was missing a prime.
As such, even if it might be useful to have a list of all known primes, annotated with what kind of prime each number is, a complete list cannot exist: the rule above always produces a prime missing from any candidate list, just as the list of all even numbers is infinitely long.
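The "product plus one" construction is easy to check numerically; a small Python sketch (`smallest_factor` is just illustrative trial division):

```python
from math import prod

def smallest_factor(n):
    """Smallest prime factor of n, found by trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

primes = [2, 3, 5, 7, 11, 13]
n = prod(primes) + 1       # 2*3*5*7*11*13 + 1 = 30031
f = smallest_factor(n)
print(n, f)                # 30031 59 -- not prime, but 59 is missing from the list
assert f not in primes
```

Note that 30031 = 59 x 509 is composite: the construction doesn't always yield a prime directly, but its factors are always primes absent from the original list.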
Interestingly, those primes' primality is normally established probabilistically rather than deductively. This is not really a practical issue for people using RSA, but it could be a philosophical issue for someone interested in how many numbers humanity has actually proven prime.
Not that I worry about this much, but if the fast prime generation methods we use are imperfect, then how bad is it when someone's PGP key or TLS certificate is based on nonprimes? Do black-hats ever strike gold and find out that someone's key is easily factorable? Or is there an additional property of common primality tests that their errors tend to be insignificant? (e.g., yes, this number isn't really prime, but its factors are very likely to be so large that they are also impractical to find, rather than, say, divisible by 7.)
I think there was some research on this that did find flawed implementations leading to a composite p or q, but basically you can choose the probability that a probable prime is actually composite to be low enough that the expected number of weak keys ever generated for this reason is still less than one.
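A back-of-the-envelope version of that argument, with assumed (not measured) numbers for the round count and the global key count:

```python
import math

# Assumed figures, purely illustrative:
k = 64                      # Miller-Rabin rounds per prime candidate
per_key_error = 2 * 4**-k   # two primes per RSA key, union bound, 4^-k per prime
keys_ever = 2**40           # a generous guess at RSA keys ever generated
expected_weak = keys_ever * per_key_error
print(math.log2(expected_weak))   # -87.0: vastly fewer than one weak key expected
```

Even with a trillion keys and a modest round count, the expected number of composite-based weak keys stays astronomically below one.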
(I haven't personally understood the probabilistic primality tests well enough to understand how they guarantee that the probability of error contributed by each round of the test is completely independent of every other round.)
In fact, if someone worked out how to crack an RSA key whenever one of those numbers turned out not to be prime, they would effectively have invented a fast compositeness check. That would be really cool in itself, and would allow RSA implementations to be fixed.
When philosophers study mankind's knowledge, it's more typical to study what mankind would ideally know in the limit, assuming inquiry could continue forever.
(Formally: the knowledge predicate is usually assumed to satisfy modus ponens: if mankind knows "A implies B" and mankind knows "A", then mankind knows "B")
Under this abstraction, if mankind knows the axioms of logic and Peano arithmetic, then mankind [ideally eventually] knows the primality of all primes.
Contrast: which Turing machines will mankind [ideally eventually] know are never-halting? This is a much harder question, and the answer is probably not "all never-halting Turing machines".
It's a little funny that mankind gets to use a full-fledged infinite-tape Turing machine in order to compute arbitrarily large computations, since no such machine would fit in our universe.
Sure, but it feels a little funny in one way to say that people "know" all of its output, although I'd agree that for other purposes being able to write a program to generate something is a relatively good description of what it means to understand it.
RSA implementations do use probabilistic primality tests when generating primes. OpenSSL [0] uses the Miller-Rabin primality test for RSA, which can be wrong, but they use enough iterations to make it very unlikely.
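For reference, a minimal sketch of Miller-Rabin in Python (not OpenSSL's implementation; the round count and small-prime shortcut here are arbitrary choices):

```python
import random

def miller_rabin(n, rounds=40):
    """Probabilistic primality test: a composite n survives all
    rounds with probability at most 4^-rounds."""
    if n < 2:
        return False
    for small in (2, 3, 5, 7, 11, 13):
        if n % small == 0:
            return n == small
    # write n - 1 = d * 2^r with d odd
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False   # witness found: definitely composite
    return True            # probably prime

print(miller_rabin(2**61 - 1), miller_rabin(2**61 + 1))  # True False
```

A "False" answer is always certain; only "True" carries the (tunable) error probability.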
I don't know to what confidence ssh-keygen and the like test their candidates, but it's not hard to make the probability of a composite slipping through smaller than the probability that the computer doing a "proper" primality test is hit by an asteroid mid-computation.
tl;dr The 50th Mersenne prime was just found by a volunteer in the GIMPS (Great Internet Mersenne Prime Search) project. It is 2^77,232,917-1 and has 23,249,425 digits. Mersenne primes are extremely rare and have the form 2^p-1, where the exponent p is itself prime. The first four Mersenne primes are 3, 7, 31, and 127 (p = 2, 3, 5, 7).
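A quick check of the exponent condition: if p is composite, 2^p - 1 is automatically composite (2^(ab) - 1 is divisible by 2^a - 1), but a prime p is not sufficient either:

```python
# Composite exponent => composite Mersenne number:
assert (2**4 - 1) % (2**2 - 1) == 0    # 15 is divisible by 3

# A prime exponent does not guarantee a Mersenne prime:
m11 = 2**11 - 1                        # p = 11 is prime...
assert m11 == 2047 == 23 * 89          # ...yet 2047 is composite

# The first four Mersenne primes come from p = 2, 3, 5, 7:
print([2**p - 1 for p in (2, 3, 5, 7)])  # [3, 7, 31, 127]
```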
One answer is a bit buried in a sub link in the article. On that page, you’ll find arguments for the following reasons: tradition, by products of the quest, collection of rare mathematical things, glory, pushing hardware performance, and contest rewards.
Personally I’m forced to admit I enjoy seeing them found while being unable to form any cogent justification.
That's actually a fairly complete and accurate list of reasons for such a project. If I was interested, I'd do it for about half of those reasons, and others may prefer the other half.
I was watching a math documentary, and one example was a cicada in North Carolina that only emerges once every thirteen years - millions of them at once. It's a defense mechanism: the sheer numbers overwhelm predators. The cicada also does this to avoid emerging in the same year as another species of cicada, which prevents cross-breeding.
The other species in the same region emerges every 17 years, so the two will only emerge at the same time every 221 years (13 x 17, since the periods share no common factor).
This is a similar concept to the odd numbers of teeth used in gear chains. If you had an 8-tooth and a 16-tooth gear, they will wear unevenly as the same teeth (with potential manufacturing defects) always meet the same teeth. If you change that to 7-tooth and 17-tooth, they will only repeat pairings every 119 teeth and wear will be distributed evenly across all teeth. In general, tooth numbers that are relatively prime (sharing no divisors) are preferred.
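The repeat period in both examples is just the least common multiple of the two numbers; a quick check:

```python
from math import gcd

def meetings_period(a, b):
    """Cycles before the same pairing recurs: the lcm of the two periods."""
    return a * b // gcd(a, b)

print(meetings_period(8, 16))   # 16  -- same gear teeth re-meet almost immediately
print(meetings_period(7, 17))   # 119 -- every tooth meets every other tooth first
```

With coprime numbers the lcm is simply the product, which is what maximizes both the gear-wear distribution and the gap between cicada co-emergences.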
January looks to be a good month for discovering large prime numbers! As a former contributor to the project, I'd love to see a primer on the best way to contribute today: CPUs vs. GPUs, and the various projects for factorization vs. primality testing.
I use Prime95 to test for usage-related recording lags for sound and video recording in our lab equipment. If we're not getting signal de-synchronization with that slamming the CPU(s), we can worry a bit less.
As someone who knows nothing about advanced mathematics, could someone explain why this matters? (i.e. beyond that this is rare and theoretically interesting)
> Mersennes are beautiful and have some surprising applications.
Unfortunately that page doesn’t elaborate on what these surprising applications are, which is itself surprising on a page that purports to answer “why”.
Euler and Gauss are (for now, at least) in my opinion the greatest mathematicians of the past two millennia (1000-1999, 2000-). Al-Khawarizmi takes the cake for the millennium before that. Then it's Euclid all the way back ;)
We know almost nothing about Euclid: we can figure out when he was active to within a century or two, and according to Pappus writing 500 years later some of his students/followers lived in Alexandria where Apollonius studied with them. That’s pretty much it for biographical details.
The earliest remaining editions of the Elements have no author mentioned, and our source that Euclid compiled it is a brief remark from Proclus 700 years later. Most of what is in the Elements was results from earlier, and it’s all but impossible to break down which bits were first done when or by whom. Most of what we can see today of the Elements or Euclid’s other books is later copies, much of it probably added/changed/reordered/... later.
If you want a 2000-year-old idol, go for Archimedes.
In the last 1000 years, the most influential mathematician is surely Newton, with an honorable mention for Leibniz. For the computer age (from 1950 through the upcoming few centuries), I’d put my vote on Grassmann (1809–1877), though his work was long ahead of its time and still substantially underappreciated.
Euler and Gauss were of course both brilliant and prolific and well worth studying, along with Descartes, Lagrange, Riemann, Poincaré, ....
Curious: can bitcoin mining rigs be modified for BOINC projects (GIMPS, Seti@Home, etc.)? If yes, then I might invest in some rigs and modify them to run BOINC.
Yes, I'm weird: I prefer to do computation for BOINC than for bitcoin. :-)
I do have some PC systems but they are shared among family members, making it difficult to run BOINC as a background task without interfering with their work.
I'll look at low-cost systems, like the Raspberry Pi, to see if I can use them as dedicated BOINC boxes instead.
Indeed, Bitcoin ASICs are not only "optimized" for the Bitcoin mining task (computing a particular hash function), they usually literally don't include the logic to perform other general-purpose computations at all! That's a big contrast with GPUs.
There have been interesting discussions about cryptocurrencies whose proof-of-work task would be something more interesting or more useful than finding partial hash collisions, but I don't think many such systems have caught on. There is a prime-related one called Primecoin.
So, I guess that's a precedent for creating new cryptocurrency designs that do something else. I don't know if there's a way to make any of the BOINC tasks into cheap-to-verify PoW systems or if anyone's tried to do so, but that might be a cool project.
There's no cryptographic use in finding a largest prime. It's far larger than any prime used in a practical system and doesn't contribute any mathematical knowledge about primes.
It's mostly just running an algorithm for long enough to pay off.
Rarely does a submission here on Hacker News attract comments of such poor quality as this one has.
I see a similar phenomenon with the press. They "hype" something they barely understand, change parts of the story to make it more interesting, or invent new words (Cyber!). This is a disservice.