• GamingChairModel@lemmy.world · 14 days ago

    Speaking, which is conveying thought, also far exceeds 10 bits per second.

    There was a 2019 study that analyzed 17 different spoken languages and found that languages with a lower complexity rate (bits of information per syllable) tend to be spoken faster, such that the information rate is roughly the same across spoken languages: about 39 bits per second.
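
    As a back-of-the-envelope illustration of that trade-off (the per-language numbers below are invented for the sketch, not taken from the study, which is presumably Coupé et al. 2019):

    ```python
    # Illustrative sketch of the density/speed trade-off; these numbers
    # are made up for demonstration, not measured data.
    languages = {
        # name: (bits of information per syllable, syllables per second)
        "information-dense, spoken slower": (7.0, 5.5),
        "information-sparse, spoken faster": (5.0, 7.8),
    }

    for name, (bits_per_syllable, syllables_per_sec) in languages.items():
        print(f"{name}: ~{bits_per_syllable * syllables_per_sec:.1f} bits/s")
    # Both land near the same ~39 bits/s information rate.
    ```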

    Of course, it could be that the actual ideas and information in that speech are inefficiently encoded, so that the actual bits of entropy are communicated at less than 39 per second. I’m curious what the linked Caltech paper says about language processing, since the press release describes deriving the 10 bits from studies analyzing how people read and write (as well as studies of people playing video games or solving Rubik’s cubes). Are they including the additional overhead of processing that information into new knowledge or insights? Are they defining the entropy of human language with a higher implied compression ratio?

    • Buffalox@lemmy.world · 14 days ago

      > with an assumed entropy of about 5 bits per English word. A 120 wpm typing speed therefore translates to 600 bits per minute, or 10 bits per second. A 160 wpm speaking speed translates to 13 bits/s.
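
      The arithmetic behind those numbers is just wpm × bits/word ÷ 60. A quick sketch, taking the quoted 5 bits per word at face value:

      ```python
      # Quick check of the quoted figures, assuming ~5 bits of entropy per word.
      BITS_PER_WORD = 5

      def wpm_to_bits_per_second(wpm: float) -> float:
          # words/minute * bits/word / (seconds/minute)
          return wpm * BITS_PER_WORD / 60

      print(wpm_to_bits_per_second(120))  # typing:   10.0 bits/s
      print(wpm_to_bits_per_second(160))  # speaking: ~13.3 bits/s
      ```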

      The problem here is that the bits of information need to be clearly defined; otherwise we are not talking about actually quantifiable information. Normally a bit can only have 2 values; here they are talking about very different types of bits, which AFAIK is not a specific quantity.

      > the human brain tends to trick itself into perceiving a much higher complexity than it is actually processing

      This is of course a thing.

        • GamingChairModel@lemmy.world · 14 days ago

        > The problem here is that the bits of information need to be clearly defined, otherwise we are not talking about actually quantifiable information

        > here they are talking about very different types of bits

        I think everyone agrees on the definition of a bit (a binary two-value variable), but the active area of debate is which pieces of information actually matter. If information can be losslessly compressed into a smaller representation, then the compressed size represents the informational complexity in bits.
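
        As a rough sketch of that idea, using zlib as a stand-in for a real compressor (the sample text is arbitrary, and zlib badly overestimates on short strings; the point is the method, not the number):

        ```python
        import zlib

        text = (
            "If information can be losslessly compressed into a smaller "
            "representation, the compressed size is an upper bound on the "
            "information content of the original."
        ).encode("utf-8")

        compressed = zlib.compress(text, 9)
        n_words = len(text.split())

        print(f"raw:         {8 * len(text)} bits")
        print(f"compressed:  {8 * len(compressed)} bits")
        print(f"upper bound: ~{8 * len(compressed) / n_words:.1f} bits/word")
        # zlib is a weak model with fixed overhead, so this bound lands far
        # above 5 bits/word; a stronger compressor or a language model
        # would push it much lower.
        ```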

        The paper itself describes the information that can be recorded but is ultimately discarded as not relevant: for typing, the forcefulness or duration of each key press doesn’t matter (though that exact same data might matter for analyzing someone playing the piano). So in terms of complexity theory, they’ve settled on 5 bits per English word and refer to prior papers that have attempted to quantify the information complexity of English.
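
        For intuition, 5 bits per word is the same as saying each word, in context, amounts to one choice among 2^5 = 32 equally likely alternatives (a per-word perplexity of about 32), far fewer than the size of an English vocabulary, because context makes most words predictable.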

    • RustyEarthfire@lemmy.world · 14 days ago

      Thanks for the link and breakdown.

      It sounds like a better description of the estimated thinking speed would be 5-50 bits per second. And when summarizing capacity/capability, one generally uses a number near the top end. It makes far more sense to say we are capable of 50 bps but often use less than to say we are only capable of 10 but sometimes do more than we are capable of doing. And the paper leans hard into 10 bps being an internally imposed limit rather than a conditional one, going as far as saying a neural-computer interface would be limited to this rate.

      “Thinking speed” is also a poor description for input/output measurement, akin to calling a monitor’s bitrate the computer’s FLOPS.

      Visual processing is multi-faceted. I definitely don’t think all of vision can be reduced to 50 bps, but maybe the serial part can, after the parallel bits have done stuff like detecting lines, arcs, textures, areas of contrast, etc.