Published May 31, 2016
Categories: computers, economy, internet, mathematics, probability, science
Tags: businesses, economy, information, internet, privacy, software
A new advance on a method known as a randomness extractor makes it easier for machines to generate truly random numbers by harvesting randomness from the environment.
The new randomness extractor combines two independent sources of weakly random numbers into one set that is nearly random, with only minor deviations. Then the researchers use a “resilient function,” a method of combining information, to turn the string of numbers into one truly random bit — a 1 or 0.
Compared with the previous state-of-the-art randomness extractors, which required input that was already very close to random, the new method can mine sources that are “much, much, much, much weaker,” says computer scientist Avi Wigderson of the Institute for Advanced Study in Princeton, N.J. The new extractor is a “substantial improvement over the previous results, and it’s very close to the best you can hope for.”
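The construction described in the article is sophisticated, but the underlying idea of a two-source extractor can be illustrated with a much older, simpler example: the classic inner-product extractor, which outputs the inner product mod 2 of two independent weakly random bit strings. This is a minimal sketch, not the new method from the article; the `weak_source` simulator and its `bias` parameter are illustrative assumptions.

```python
import secrets

def weak_source(n, bias=0.8):
    # Simulated weak source: each bit is 1 with probability `bias`,
    # so the output is far from uniform. Purely illustrative.
    return [1 if secrets.randbelow(100) < bias * 100 else 0 for _ in range(n)]

def inner_product_extractor(x, y):
    # Classic two-source extractor: the inner product mod 2 of two
    # bit strings. When the sources are independent and jointly carry
    # enough min-entropy, the output bit is close to uniform.
    return sum(a & b for a, b in zip(x, y)) % 2

# Combine two independent weak sources into one nearly random bit.
x = weak_source(256)
y = weak_source(256)
bit = inner_product_extractor(x, y)
```

The inner-product extractor needs both sources to be fairly strong; the advance reported here is precisely that the new construction tolerates sources that are far weaker than what constructions like this require.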
As The Guardian reports, Wikipedia is now adding only about 1,300 articles per day, down from roughly 2,200 a couple of years ago. Some slowdown is natural, since there are fewer obvious topics left to write about. But the article also points out something more important: the dynamics of the community are changing. The small group of elite editors is slowly becoming a closed class that newcomers have trouble entering. This is a common problem in all communities. The elders have little patience for the newbies, expecting them to devote the same effort to the project that they did. That much is to be expected, and the newbies just have to deal with it.
However, Wikipedia is not like other open source projects. It carries power beyond measure, and there is no question of whether the project will go on. Even without up-to-date information, Wikipedia will always be an invaluable resource. And it is precisely because of that power that it can become a target for editors with a very specific agenda. So how can a free encyclopedia remain objective?
The short answer is that it cannot. There is no way for Wikipedia to cover every aspect of a historical event. There is no way to be absolutely right about everything you say about cats. Wikipedia is not, and will never be, an authoritative source. It is extremely valuable and extremely informative, but not authoritative. It undoubtedly contains a far smaller proportion of junk and misinformation than the rest of the internet. It is a huge and extremely well-executed project, and the world is a little better today because of it.
Did you know that the internet is only 5000 days old?
Can you imagine how it will be after another 5000 days?
Kevin Kelly gives some insight: