• 8 Posts
  • 66 Comments
Joined 1 year ago
Cake day: December 18th, 2023





  • Not quite. Information always depends on context. It is not a fundamental physical quantity like energy. When you have a piece of paper with English writing on it, you can read and understand it. If you don’t know the script or language, you won’t even be able to tell whether it’s a script or language at all. Some information needs to be in your head already. That’s simply how information works.

    You take in information through the senses and do something based on that information. Information flows into your brain through your senses and then out again in the form of behavior. The throughput is throttled to something on the order of 10 bits/s. When you think about it for a bit, you realize that a lot of things are predicated on that. Think of a video game controller. There are only a few buttons. The interface between you and the game has a bandwidth of only a few bits.
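    A rough back-of-the-envelope sketch of that controller bandwidth (the button count and input rate below are made-up, illustrative numbers, not measurements):

    ```python
    import math

    buttons = 8                         # on/off buttons on the pad (illustrative)
    states = 2 ** buttons               # distinct button combinations
    bits_per_input = math.log2(states)  # 8 bits, if every combination were equally likely

    inputs_per_second = 3               # how often a player meaningfully changes the state
    print(f"Upper bound: {bits_per_input * inputs_per_second:.0f} bits/s")

    # Real play is far more predictable (few combinations actually occur, and
    # sequences are correlated), so the effective rate lands much lower --
    # in the ballpark of the ~10 bits/s figure.
    ```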


  • Cn y ndrstnd this?

    You probably can. Human language has built-in redundancy. You can drop parts and still be understood. That’s useful for communicating in noisy environments or with people who are hard of hearing. So you could say that the actual information content is less than 1 letter per letter, so to speak.

    Properly, information content is measured in bits. A more detailed analysis of English text gives a value of about 1 bit per character.
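    Here is a minimal sketch of where numbers like that come from. Counting single-letter frequencies alone (ignoring context) gives roughly 4 bits per character; models that take context into account push the estimate down toward 1 bit. The file name is just a placeholder for any English sample.

    ```python
    import math
    from collections import Counter

    text = open("sample.txt").read().lower()  # placeholder: any decent-sized English sample

    counts = Counter(c for c in text if c.isalpha() or c == " ")
    total = sum(counts.values())
    entropy = -sum((n / total) * math.log2(n / total) for n in counts.values())
    print(f"{entropy:.2f} bits per character")  # typically around 4 bits for English

    # This treats every character as independent. Because letters are highly
    # predictable from the preceding text, context-aware models bring the
    # estimate down to roughly 1 bit per character.
    ```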

    Sidenote: You shouldn’t ask technical or scientific questions in this community. I don’t know why information theory denialism is so hot right now, but it obviously is.


  • The bit is a unit of information, like the liter is a unit of volume. The bit may also be called a shannon, though I do not know where that is commonly done.

    When people talk about a liter, they are often thinking about a liter of milk, or gas. That is, they are thinking of a quantity of a certain substance rather than of a volume in the abstract. People may also say liter to mean a specific carton or bottle, even if it no longer contains a liter of whatever.

    Similarly, people will say “bit” when they mean something more specific than just a unit of measurement. For example, the least significant bit, the parity bit, and so on. It may refer to any of a number of things that can hold 1 bit of information.

    The fact that the headline talks about bits/s makes clear that it is about how much information goes through a human mind per unit of time.




  • In some contexts, a bit can refer to a boolean variable, a flag. In other contexts, it may refer to the voltage at a certain point, or any number of other things. But when you are talking about bits/s, it’s a measure of information.

    These are not the same thing, because a stored bit does not always carry one full bit of information (entropy).
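    A quick way to see that, sketched in Python: the information carried by a single stored bit depends on how predictable its value is, and only a perfectly unpredictable bit carries a full bit.

    ```python
    import math

    def binary_entropy(p):
        """Information (in bits) carried by a bit that is 1 with probability p."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    for p in (0.5, 0.9, 0.99, 1.0):
        print(f"P(bit = 1) = {p}: {binary_entropy(p):.3f} bits")
    # 0.5 -> 1.000, 0.9 -> 0.469, 0.99 -> 0.081, 1.0 -> 0.000
    ```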

    Yes, but as you know, this implies that the information is already available. You can use that knowledge to create a compression algorithm, or to define a less redundant file format. That’s very practical.
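    For instance (a toy demonstration, nothing more): feeding redundant English text to a general-purpose compressor like zlib already squeezes it well below 8 bits per original byte, precisely because so much of each character is predictable from what came before.

    ```python
    import zlib

    text = ("You can drop parts of an English sentence and still be understood. " * 20).encode()

    compressed = zlib.compress(text, 9)
    print(f"{len(text)} bytes -> {len(compressed)} bytes "
          f"({8 * len(compressed) / len(text):.2f} bits per original byte)")
    ```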

    We can also get a bit philosophical and ask: How much information does a backup contain? The answer could be: By definition, 0 bits. That’s not a useful answer, which suggests a problem with how the definition is being applied.

    A more interesting question might be: How much information does a file that stores the first 1 million digits of π contain?
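    The file is about a megabyte on disk, yet a program only a few lines long can regenerate it exactly, which is a hint that its information content is far smaller than its size. A sketch using the third-party mpmath library (the output file name is just a placeholder):

    ```python
    from mpmath import mp

    mp.dps = 1_000_002       # working precision: ~1 million significant decimal digits
    digits = str(mp.pi)      # "3.14159..." with roughly a million digits after the point

    with open("pi_digits.txt", "w") as f:  # placeholder file name
        f.write(digits)
    ```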






  • In all likelihood, it wouldn’t be enforced against small fry, but it might be. The fact is that running something like that costs time and money. If all you get in return is legal threats and more demands, maybe you just don’t bother anymore.

    What kind of future is there in flouting these laws? The GDPR and Article 13 are rarely enforced against small fry, but it does happen. And all I see, even on Lemmy, are demands for more laws and more enforcement. When somebody gets busted, it’s their own bloody fault for having ignored the law for years and being so recalcitrant.