On 3rd July a member of the Digital Point forums posted a snippet of output he received when a query for a cached page from a Google datacenter (64.233.183.104) went wrong. What follows is a rare sight: Google's internal error output:
pacemaker-alarm-delay-in-ms-overall-sum 2341989
pacemaker-alarm-delay-in-ms-total-count 7776761
cpu-utilization 1.28
cpu-speed 2800000000
timedout-queries_total 14227
num-docinfo_total 10680907
avg-latency-ms_total 3545152552
num-docinfo-disk_total 2200918
queries_total 1229799558
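For what it's worth, that first block reads like plain whitespace-separated counter names and values, so a few lines of Python are enough to poke at it. This is purely a sketch: none of these counter names are a documented API, and my guess that avg-latency-ms_total is a running sum of per-query latencies (so dividing by queries_total gives a mean) could well be wrong.

```python
# Hypothetical sketch: treat the leaked dump as "name value" pairs.
# The counter names come from the snippet above; everything else is guesswork.
raw = """\
pacemaker-alarm-delay-in-ms-overall-sum 2341989
pacemaker-alarm-delay-in-ms-total-count 7776761
cpu-utilization 1.28
cpu-speed 2800000000
timedout-queries_total 14227
num-docinfo_total 10680907
avg-latency-ms_total 3545152552
num-docinfo-disk_total 2200918
queries_total 1229799558
"""

def parse_counters(text):
    """Split each line into a name and a numeric value."""
    counters = {}
    for line in text.splitlines():
        name, value = line.rsplit(None, 1)
        counters[name] = float(value) if "." in value else int(value)
    return counters

stats = parse_counters(raw)

# If avg-latency-ms_total really is a running latency sum, the mean
# latency per query works out to roughly 2.88 ms.
mean_latency_ms = stats["avg-latency-ms_total"] / stats["queries_total"]
print(f"mean latency: {mean_latency_ms:.2f} ms")

# Timed-out queries as a fraction of all queries (vanishingly small).
timeout_rate = stats["timedout-queries_total"] / stats["queries_total"]
print(f"timeout rate: {timeout_rate:.8f}")
```

If that reading is right, whatever this box is was answering queries in under 3 ms on average, with barely one query in 80,000 timing out.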
e_supplemental=150000 --pagerank_cutoff_decrease_per_round=100 --pagerank_cutoff_increase_per_round=500 --parents=12,13,14,15,16,17,18,19,20,21,22,23 --pass_country_to_leaves --phil_max_doc_activation=0.5 --port_base=32311 --production --rewrite_noncompositional_compounds --rpc_resolve_unreachable_servers --scale_prvec4_to_prvec --sections_to_retrieve=body+url+compactanchors --servlets=ascorer --supplemental_tier_section=body+url+compactanchors --threaded_logging --nouse_compressed_urls --use_domain_match --nouse_experimental_indyrank --use_experimental_spamscore --use_gwd --use_query_classifier --use_spamscore --using_borg
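The second block looks like ordinary command-line flags: "--name=value" pairs, plus boolean switches where "--use_x" turns something on and "--nouse_x" turns it off. That's only my reading of the format, not anything Google has confirmed, but here's a toy parser built on that assumption:

```python
# Hypothetical sketch of parsing "--name=value" / "--name" / "--noname"
# style flags. This is my guess at the convention, not Google's parser.
def parse_flags(argv):
    flags = {}
    for token in argv:
        token = token.lstrip("-")
        if "=" in token:
            name, value = token.split("=", 1)
            flags[name] = value
        elif token.startswith("no"):
            # e.g. --nouse_compressed_urls negates use_compressed_urls
            flags[token[2:]] = False
        else:
            # e.g. --production is a plain boolean switch
            flags[token] = True
    return flags

sample = ["--port_base=32311", "--production", "--nouse_compressed_urls"]
print(parse_flags(sample))
# {'port_base': '32311', 'production': True, 'use_compressed_urls': False}
```

Read that way, the interesting flags in the dump are simple on/off switches: spam scoring is on, the experimental "indyrank" is off, and something called borg is in use.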
I am not for one minute suggesting that I know anything about this code, but two things stand out to me: “use_experimental_spamscore” and “use_spamscore --using_borg”. So as well as having an algorithm that scores an individual page on linkage data and on-page factors, Google also seem to have a new scoring algorithm for spam. Could this be their reaction to the recent bad data dump, or just a refinement of how they filter out spam?
Aside from this, Google are now using Borg technology. How they managed to get in touch with the Borg collective is beyond me, but Yahoo and MSN are in serious trouble now. Having said that, maybe Google and the Borg did some sort of Capt. Janeway-style collaboration? Maybe Google will announce a deal with Unimatrix One to place AdWords in Borg cubes? 😉 Sorry, couldn't resist. Crap, that's another bad pun.