cross entropy (Q1685498)
in information theory, given two probability distributions, the average number of bits needed to identify an event if the coding scheme is optimized for the ‘wrong’ probability distribution rather than the true distribution
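In standard notation (assumed here: $p$ denotes the true distribution and $q$ the distribution the coding scheme is actually optimized for), this description corresponds to the usual formula, with the logarithm taken base 2 when the result is measured in bits:

$$H(p, q) = -\sum_{x} p(x) \log_2 q(x)$$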
Language | Label | Description | Also known as
---|---|---|---
default for all languages | No label defined | |
English | cross entropy | in information theory, given two probability distributions, the average number of bits needed to identify an event if the coding scheme is optimized for the ‘wrong’ probability distribution rather than the true distribution |
Statements
Identifiers
Sitelinks
Wikipedia (17 entries)
- arwiki أنتروبيا متقاطعة
- cawiki Entropia creuada
- cswiki Křížová entropie
- dewiki Kreuzentropie
- enwiki Cross-entropy
- eswiki Entropía cruzada
- fawiki آنتروپی متقاطع
- frwiki Entropie croisée
- itwiki Entropia incrociata
- jawiki 交差エントロピー
- kowiki 교차 엔트로피
- plwiki Entropia krzyżowa
- ptwiki Entropia cruzada
- ruwiki Перекрёстная энтропия
- ukwiki Перехресна ентропія
- zh_yuewiki 交叉熵
- zhwiki 交叉熵