cross entropy (Q1685498)

From Wikidata
in information theory, given two probability distributions, the average number of bits needed to identify an event if the coding scheme is optimized for the ‘wrong’ probability distribution rather than the true distribution
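The definition above corresponds to H(p, q) = −Σ p(x) log₂ q(x). A minimal numeric sketch (the function name and the example distributions below are illustrative assumptions, not part of the Wikidata item):

```python
import math

def cross_entropy(p, q):
    """Cross entropy H(p, q) = -sum_x p(x) * log2(q(x)), in bits."""
    return -sum(pi * math.log2(qi) for pi, qi in zip(p, q) if pi > 0)

# True distribution p and a 'wrong' coding distribution q over 4 events.
p = [0.5, 0.25, 0.125, 0.125]
q = [0.25, 0.25, 0.25, 0.25]

print(cross_entropy(p, p))  # entropy H(p) = 1.75 bits
print(cross_entropy(p, q))  # H(p, q) = 2.0 bits: coding for q wastes bits
```

Coding with the optimal scheme for q rather than the true p costs H(p, q) ≥ H(p) bits per event on average; the gap is the Kullback–Leibler divergence.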

