binary128
Dormant account
- Joined
- May 18, 2009
- Location
- Vancouver, BC
zanzibar,
Re:
As I said, I'm not trying to create a problem here because I don't think one exists. It's more an interesting mathematical / theoretical / philosophical discussion. I know I am repeating myself, but I want to make it really clear to anyone reading this that I am not trying to harm your business.
I have no problem at all with any of your questions. I am sorry if you are inferring such, as that is not at all my intention. I have a high level of confidence in my code (I've been through three audits, the first two of which could conservatively be called "grueling"), and I really enjoy the discussion.
As I mentioned above:
The introduction of the init-by-array method, which still uses 32-bit integers but 624 of them, "solves the problem". (To be exact, on this one I do not have quantitative documentation, only the statements published by the Twister's developers when this method was introduced back in 2002, and also by others in the field. But based on this I have proceeded with confidence.)
The only part of the scope of this discussion to which I cannot put a number: exactly what is the result of this init-by-array method? I think it would be logical to conclude that it seeds the Twister to a space of more than 2^32 initialized permutations. However, whether the result is less than, greater than, or equal to 2^225 - I honestly don't know.
(If you have a choice between having a colonoscopy or wading through a Google search on init_by_array, take the colonoscopy.)
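(For a concrete feel for the numbers, here's an illustrative Python sketch of my own - not Galewind's code. CPython's `random` module happens to use the Mersenne Twister, and `random.seed()` with a large integer splits it into 32-bit words for the Twister's init_by_array, so it's one place to see init-by-array seeding in the wild. 2^225 is the relevant yardstick because 52! - the number of distinct deck orderings - sits just above it.)

```python
import math
import random

# 52! = number of distinct orderings of a 52-card deck.
deck_orderings = math.factorial(52)
print(f"52! is about 2^{math.log2(deck_orderings):.2f}")  # just over 2^225

# CPython's random.seed() accepts arbitrarily large integers; the int is
# split into 32-bit words and fed to the Twister's init_by_array, so the
# usable seed space is not capped at 2^32 values.
random.seed(123456789 ** 10)  # a seed far wider than 32 bits

deck = list(range(52))
random.shuffle(deck)  # one of the 52! possible orderings
print(deck[:5])
```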
As to atom collisions - who knows? Could be? Does anyone really know what goes on in the bowels of CERN?
But no, sorry to disappoint; there ain't no rocket surgery goin' on at Galewind.
As I understand it, the issue with hardware-related RNGs is the ability to link them to application code. That is, there has to be an API that responds at the microsecond level. And you also need to have a backup (hardware or software?) in the event of failure.
But, to be accurate, I know jack shit about all of that.
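(That said, the "backup in the event of failure" idea can at least be sketched. This is a toy Python illustration of my own - an assumption about how a failover might look, not how any real gaming server does it: prefer the OS entropy source, and drop back to a software PRNG if it's unavailable.)

```python
import os
import random

def random_bytes(n: int) -> bytes:
    """Return n random bytes, preferring the OS entropy source (which may
    be hardware-backed) and falling back to a software PRNG on failure.
    Toy sketch only: a production gaming RNG would need audited failover,
    not a silent fallback like this."""
    try:
        return os.urandom(n)  # OS-level source, possibly hardware-backed
    except NotImplementedError:
        # Software backup: the Mersenne Twister behind the random module.
        return bytes(random.getrandbits(8) for _ in range(n))

print(random_bytes(16).hex())  # 32 hex characters
```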
Re: your link to the Worsley School article. The part I liked:
"What this means, in practical terms, is that every time you shuffle a deck of 52 cards, you get a new arrangement of the cards that has never been seen by anyone before, and will never occur again!"
Chris