John L. Taylor

178 posts

Review of Debunked

The New York Review of Books features Freeman J. Dyson’s review of Debunked! Here is a particularly interesting excerpt: The book also has a good chapter on “Amazing Coincidences.” These are strange events which appear to give evidence of supernatural influences operating in everyday life. They are not the result of deliberate fraud or trickery, but only of the laws of probability. The paradoxical feature of the laws of probability is that they make unlikely events happen unexpectedly often. A simple way to state the paradox is Littlewood’s Law of Miracles. Littlewood was a famous mathematician who was teaching at […]
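The excerpt cuts off before stating the law, but Littlewood’s usual formulation is easy to check by hand: if a person experiences roughly one event per second during about eight alert hours a day, a million events accumulate in roughly 35 days, so “one-in-a-million” events should arrive about once a month. A minimal sketch of that arithmetic (the eight-hour figure is the conventional assumption, not Dyson’s exact wording):

```python
import random

# Littlewood's Law sketch: treat each second of "alert time" as an
# independent event with a one-in-a-million chance of being a "miracle".
# At ~8 alert hours per day, a million events accumulate in ~35 days,
# so we expect roughly one miracle per month.

SECONDS_PER_DAY = 8 * 60 * 60   # alert seconds per day (assumed 8 hours)
MIRACLE_PROB = 1e-6             # a "miracle" is a one-in-a-million event

def expected_miracles(days: int) -> float:
    """Expected number of one-in-a-million events over `days` alert days."""
    return days * SECONDS_PER_DAY * MIRACLE_PROB

def simulate_miracles(days: int, seed: int = 0) -> int:
    """Monte Carlo count of miracles over `days` alert days."""
    rng = random.Random(seed)
    events = days * SECONDS_PER_DAY
    return sum(1 for _ in range(events) if rng.random() < MIRACLE_PROB)

print(expected_miracles(35))   # ≈ 1.0 — about one "miracle" per month
```

The point the excerpt gestures at falls out immediately: nothing supernatural is needed for monthly miracles, only a large enough event count.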

Culture Jamming

A while ago I made reference to culture jamming without much explanation. Culture Jamming is the practice of challenging, countering, or critiquing popular culture by co-opting its images, language, and methods. Appropriately, the term ‘jamming’ is borrowed from radio culture, where it means to interrupt or block a signal. There is no universal philosophical doctrine culture jammers abide by, but there are a handful of interesting ideas floating around. First, the failure of rational argumentation to win the minds of the public: Once upon a time, the “Evils of the Establishment” were subject to rational critique by academics and revolutionaries. Most people still […]

Blockheads and Other Brutes

Ned Block defined a system known today as a Blockhead (“Troubles with Functionalism”, Minnesota Studies in the Philosophy of Science) to illustrate a problem that look-up tables pose for the Turing test. Blockhead is a “stupid” machine that stores all possible conversations within some limited duration and, thus, passes the Turing Test. This is, of course, physically impossible, as Frank Tipler argues with back-of-the-envelope calculations in The Physics of Immortality: … the human brain can code as much as 10^15 bits is correct, then since an average book codes about 10^6 bits, it would require more than […]
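The truncated quote already gives the two figures Tipler compares — a brain capacity of about 10^15 bits against roughly 10^6 bits per book — and the real force of the objection is how a conversational look-up table explodes combinatorially. A quick check of both, where the alphabet size and conversation length are illustrative assumptions of mine, not Block’s or Tipler’s numbers:

```python
import math

# Figures from the Tipler quote: brain capacity vs. an average book.
BRAIN_BITS = 10**15   # Tipler's estimate of human memory capacity
BOOK_BITS = 10**6     # information content of an average book
print(BRAIN_BITS // BOOK_BITS)   # 1,000,000,000 — a billion books per brain

# Why Blockhead's table is hopeless: the number of strings it must
# cover grows exponentially in conversation length. Illustrative
# assumptions: a 27-symbol alphabet, a short ~1000-character exchange.
ALPHABET = 27
CHARS = 1000
digits = CHARS * math.log10(ALPHABET)   # decimal digits in 27**1000
print(round(digits))   # ≈ 1431 — dwarfs the ~10^80 atoms in the
                       # observable universe, let alone 10^15 bits
```

Even granting Blockhead a brain-sized store a billion books deep, the table for one short conversation is astronomically out of reach.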

Philosophy in Runtime

In the past few months I have been challenged to defend computational philosophy, particularly philosophical modeling. Philosophical modeling, like scientific modeling, is a formalization process. However, instead of capturing real-world phenomena, philosophical modeling captures thought experiments. The process of encoding a thought experiment in a formal system is, itself, beneficial in the same way as standard conceptual analysis: hidden assumptions are unearthed and seemingly simple ideas yield refined notions. But, in a way, encoding is more honest: the process is not satisfied until you reach a syntactic, algorithmic level of explicitness, and, once our intuitions are encoded, further light may be […]