The GCP's methodology is based on the hypothesis that events which elicit widespread emotion or draw the simultaneous attention of large numbers of people may affect the output of hardware random number generators in a
statistically significant way. The GCP maintains a network of hardware random number generators which are interfaced to computers at 70 locations around the world. Custom software reads the output of the random number generators and records a trial (sum of 200 bits) once every second. The data are sent to a server in Princeton, creating a database of synchronized parallel sequences of random numbers. The GCP is run as a replication experiment, essentially combining the results of many distinct tests of the hypothesis. The hypothesis is tested by calculating the extent of data fluctuations at the time of events. The procedure is specified by a three-step experimental protocol. In the first step, the event duration and the calculation algorithm are pre-specified and entered into a formal registry.
[10] In the second step, the event data are extracted from the database and a Z score, which indicates the degree of deviation from the null hypothesis, is calculated using the pre-specified algorithm. In the third step, the event Z score is combined with the Z scores from previous events to yield an overall result for the experiment.
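The sketch below illustrates the arithmetic behind steps two and three under stated assumptions: each one-second trial is the sum of 200 fair bits (binomial with mean 100 and variance 50), the event Z score is computed from a simple mean-shift statistic, and event scores are combined with Stouffer's method. The actual per-event algorithm is whatever was pre-specified in the GCP registry and may differ; the function names and simulated data here are illustrative only.

```python
import math
import random

# Null hypothesis: each one-second trial is the sum of 200 fair bits,
# i.e. binomial(n=200, p=0.5) with mean 100 and variance 50.
TRIAL_MEAN = 100.0
TRIAL_VAR = 50.0

def simulate_trial():
    """One trial: the sum of 200 random bits (here from a pseudo-RNG)."""
    return sum(random.getrandbits(1) for _ in range(200))

def event_z_score(trials):
    """Z score for one event from its per-second trial sums.

    A simple mean-shift statistic is used for illustration; the GCP
    registry specifies the actual algorithm for each event.
    """
    n = len(trials)
    expected = n * TRIAL_MEAN
    sd = math.sqrt(n * TRIAL_VAR)
    return (sum(trials) - expected) / sd

def combined_z(z_scores):
    """Combine event Z scores into one overall result (Stouffer's method)."""
    return sum(z_scores) / math.sqrt(len(z_scores))

# Example: a one-hour event (3600 trials) from one simulated device,
# combined with two hypothetical Z scores from earlier events.
trials = [simulate_trial() for _ in range(3600)]
overall = combined_z([event_z_score(trials), 0.8, -0.3])
```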
The remote devices have been dubbed Princeton Eggs, a reference to the coinage electrogaiagram, a portmanteau of electroencephalogram and Gaia.[11] Supporters and skeptics have referred to the aim of the GCP as being analogous to detecting "a great disturbance in the Force."[2][12][13]