## Setting up a caching nameserver in Ubuntu

Install dnsmasq with `sudo apt-get install dnsmasq`, then edit `/etc/resolv.conf` so that the first line reads `nameserver 127.0.0.1` (or points to some other address on the local machine, if desired). That’s it.
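The whole procedure, plus a quick check that the cache is answering (the `dig` check is an extra step of mine and assumes the `dnsutils` package is installed):

```shell
# Install the lightweight caching nameserver; the Ubuntu package
# starts the daemon automatically, listening on 127.0.0.1.
sudo apt-get install dnsmasq

# Make the local cache the first resolver consulted.
# /etc/resolv.conf should begin with:
#   nameserver 127.0.0.1

# Optional sanity check: query the cache directly.
dig @127.0.0.1 ubuntu.com
```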

## Getting Firefox to open links clicked from within Thunderbird

In Thunderbird:

1. Edit –> Preferences –> Advanced –> General –> Config Editor
2. Right-click in the whitespace –> New –> String
3. Name the string `network.protocol-handler.app.ftp`; its value is the path and command used to start Firefox.
4. Repeat for `network.protocol-handler.app.http` and `network.protocol-handler.app.https`.
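Equivalently, the three strings end up in Thunderbird’s prefs.js looking like the following (the `/usr/bin/firefox` path is just an example — use whatever command starts Firefox on your system):

```js
user_pref("network.protocol-handler.app.ftp",   "/usr/bin/firefox");
user_pref("network.protocol-handler.app.http",  "/usr/bin/firefox");
user_pref("network.protocol-handler.app.https", "/usr/bin/firefox");
```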

## Recursive justification vs. probabilism

> We are like sailors who on the open sea must reconstruct their ship but are never able to start afresh from the bottom. Where a beam is taken away a new one must at once be put there, and for this the rest of the ship is used as support. In this way, by using the old beams and driftwood the ship can be shaped entirely anew, but only by gradual reconstruction.
>
> — Otto Neurath

This sort of recursive justification doesn’t seem to work very well, at least not according to the laws of probability. Consider the simple case where A and B are used to justify C, B and C to justify A, and A and C to justify B.

Since C is derived from A and B, the probability we assign to C cannot exceed the greater of P(A) and P(B), and should in fact be less, e.g. to account for the possibility that we’ve reasoned incorrectly and mistakenly concluded that A and B imply C. The same applies to each of the other claims, and to whatever further claims serve as their basis.

Since, at least according to a thoroughgoing probabilist, all beliefs are subject to review, the probability of each must be less than 1. How might this actually work out? What probabilities could we assign that satisfy some basic rules of probability, so that

P(A) < max[P(B),P(C)] < 1

P(B) < max[P(C),P(A)] < 1

P(C) < max[P(A),P(B)] < 1

and correspond to believing A, B and C and yet still leaving open the possibility of revision along Bayesian lines or something similar so that

m < P(A) < 1

m < P(B) < 1

m < P(C) < 1

where m is the minimum sufficient degree of belief such that assigning P(X) = m is equivalent to believing that X?

As it happens, this is a problem without a solution. That is, if our confidence in each of our beliefs comes from our ability to derive it from some subset of our other beliefs, then any level of confidence is unreasonable according to the laws of probability. Why? If we hold N beliefs, recursive justification implies that our degree of confidence in any belief must be strictly less than the confidence we have in the most strongly held supporting belief, and so on, implying some clearly impossible relation along the lines of

P(Belief 1) < P(Belief 2) < P(Belief 3) < … < P(Belief N) < P(Belief 1).
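The three-belief case can also be checked exhaustively. A minimal Python sketch (the grid spacing is my choice) confirms that no assignment satisfies the three strict inequalities above — whichever of the three probabilities is largest violates its own bound:

```python
import itertools

def satisfies(p):
    """True if (a, b, c) meets P(A) < max[P(B), P(C)] and the two analogues."""
    a, b, c = p
    return a < max(b, c) and b < max(c, a) and c < max(a, b)

grid = [i / 20 for i in range(1, 20)]  # candidate probabilities 0.05 .. 0.95
solutions = [p for p in itertools.product(grid, repeat=3) if satisfies(p)]
print(solutions)  # empty list: no assignment works
```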

One attempt at rehabilitating recursive justification might be to suppose that each belief can be justified by multiple non-intersecting subsets of other beliefs. For example, A is implied by (B, C) and also by (D, E). For the moment, assume away the possibility of mistaken inference to A, so that P(A) = P((B and C) or (D and E)). Could each belief in the set {A, B, C, D, E} be justified from the other beliefs in the set in a way consistent with the laws of probability? (Whether any actual human holds actual beliefs in such a relation is another matter.)

To do: Find numerical values for P(A), P(B), P(C), P(D), and P(E) that allow each belief to be justified by the others, and that satisfy the laws of probability, or show that no such set of values exists.
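One modest way to attack this numerically: add the further (and philosophically loaded) assumption that the justifying beliefs are probabilistically independent — my assumption, not one the problem statement grants, and arguably in tension with mutual justification. Then a symmetric assignment P(A) = … = P(E) = p must satisfy a fixed-point equation with a root strictly between 0 and 1:

```python
from math import sqrt

# Under independence, P((B and C) or (D and E)) = p*p + p*p - p**4
# when every belief gets the same probability p.  Self-justification
# then requires p = 2*p**2 - p**4.  Factoring p**4 - 2*p**2 + p = 0 as
# p * (p - 1) * (p**2 + p - 1) = 0, the only root in (0, 1) is:
p = (sqrt(5) - 1) / 2            # ≈ 0.618, the golden-ratio conjugate

derived = 2 * p**2 - p**4        # P(A) as computed from its justifiers
print(p, derived)                # the two agree
```

This doesn’t settle the to-do — whether independence can coexist with this justification structure is exactly the open question — but it shows the symmetric equation itself is satisfiable.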
