
What is information?


I am fascinated with information theory, as put together by Claude Shannon in the 1940s. It is amazing to me that this concept arose from analysing letters in the alphabet and then was later abstracted to black holes. But what I find lacking is the definition of what information actually is.

Wikipedia's page on information theory gives me this very early on:

A key measure in information theory is entropy. Entropy quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process.

It seems to me that the definition of information already moves the goalposts at a very early point: it establishes that information has to do with entropy, this other thing. The very first line on Wikipedia's page about information itself states:

Information can be thought of as the resolution of uncertainty; [...]

The "can be thought of" there throws me off a bit.

From the comments on this other PSE question I gathered that information theory's entropy correlates with entropy in thermodynamics. Which is great, but it doesn't define what information is, just that it obeys a law similar to the one governing thermodynamic systems.

Another comment stated that information is physical (leading to this empty Wikipedia page) and that it may be analogous to energy (SEP).

So it seems to me that information and energy are related concepts. From Wikipedia's page on energy we get this very precise, very physical definition of what it is:

In physics, energy is the quantitative property that must be transferred to a body or physical system to perform work on the object, or to heat it. Energy is a conserved quantity; the law of conservation of energy states that energy can be converted in form, but not created or destroyed.

However, another PSE answer states something that seems contradictory:

Information is a non-physical concept, [...]

Thus it sounds like we have a pretty good grasp of what energy is and that it is affected by entropy. However, we do not seem to have defined information except for how it is also affected by entropy.

Entropy is most visible when a system changes from one state to another. Entropy in information theory also arises when something is communicated from a sender to a receiver. But that sounds to me like defining water as "the thing that goes through a pipe".

Thus the question is: what is information?

Is it a quantity, like the number "2" in "the two apples on the table"? Or is it a quality, like the roundness and sweetness of the fruit that makes it an apple? Or is it the apple itself (either as a Kantian "apple-in-itself", inaccessible to us, or a particular approximation of the apple)?

A follow-up question then is: is there more contemporary work on defining it?

Moreover: shouldn't there now be a Philosophy of Information as a field of enquiry?


Edit 1:

It has been noted in the answers and comments that defining information, or defining energy, is fruitless. To summarise and quote the argument: "the more we investigate nature, the more we fail to get anything but abstract math."

The correlation between Shannon entropy and Boltzmann entropy has also been discussed, the latter arising from the transformation of a thermodynamic system from state A to state B and from the relation between the micro- and macrostates of the system in states A and B.
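To spell out that parallel (my own summary of the standard textbook relationship, not something established in that discussion):

$$ S_B = k_B \ln W \qquad\qquad S_G = -k_B \sum_i p_i \ln p_i \qquad\qquad H = -\sum_i p_i \log_2 p_i $$

Here $W$ is the number of microstates compatible with a given macrostate. When all microstates are equally probable, the Gibbs form reduces to Boltzmann's, and Shannon's $H$ has exactly the same functional form, differing only by the constant $k_B$ and the base of the logarithm.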

So perhaps a more refined question would then be: if Boltzmann entropy happens when heat or pressure is transformed in a thermodynamic system, what is being transformed when Shannon entropy arises?


Edit 2:

Just to reiterate, I'm not looking for the meaning of the word "information". I'm looking for the phenomenological study of information as "a thing" that exists in the universe.

It has also been suggested that information is the reduction of uncertainty in a symbolic system. Examples were given using a deck of cards or dice to illustrate the point, and it has been argued that the uncertainty in those systems is subjective: if we don't know the sequence of the cards in the deck, there is more uncertainty there. However, this seems too narrow an approach. Say I came from a planet where we store decks of cards in the precise order that earthlings call random. I would then have more information about the deck than the earthling, which shows that information is subjective. But it is only subjective because decks of cards and dice are things that earthlings make!
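To pin the card example down numerically (a minimal sketch of the standard calculation, not a claim about which view is right):

```python
import math

# Entropy, in bits, of a 52-card deck whose order is completely unknown:
# all 52! orderings are equally likely to that observer, so H = log2(52!).
unknown_order_bits = math.log2(math.factorial(52))
print(round(unknown_order_bits, 1))  # 225.6

# For an observer who already knows the exact order, one arrangement has
# probability 1 and the rest probability 0, so the same deck has H = 0 bits.
known_order_bits = 0.0
```

The number always refers to a probability distribution, and the distribution encodes what a particular observer knows, which is exactly where the apparent subjectivity comes in.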

Contrast that with the debate over whether information is lost at the event horizon of a black hole. Is that information subjective too? The no-hair theorem states that only mass, electric charge and angular momentum are preserved when a body falls past a black hole's event horizon. Is angular momentum subjective too?

So it seems to me that information is not subjective. It is subjective when we apply it to things that are particular to us. But there's strong indication that it is a "thing" that "happens" in spite of us. What is this "thing"?

I think this is a question worthy of philosophical exploration.


Note 1: I'm not looking for a semantic definition of information; I'm looking for an epistemological definition of the concept.

Note 2: I'm a long-time lurker, but first-time asker, and not a trained philosopher, so please correct any mistakes in my question.

