This document defines a set of WebIDL objects that allow access to statistical
information about an RTCPeerConnection.
These objects are returned from the getStats API that is specified in [WEBRTC].
Status of This Document
This section describes the status of this document at the time of its publication. Other documents may supersede this document. A list of current W3C publications and the latest revision of this technical report can be found in the W3C technical reports index at https://www.w3.org/TR/.
Since the previous publication as a Candidate Recommendation, the stats objects were significantly reorganized to better match the underlying data sources. In addition, the networkType property was deprecated for preserving privacy, and the statsended event was removed as no longer needed.
W3C publishes a Candidate Recommendation to indicate that the document is believed to be
stable and to encourage implementation by the developer community. This Candidate
Recommendation is expected to advance to Proposed Recommendation no earlier than
21 February 2020.
Publication as a Candidate Recommendation does not imply endorsement by the
W3C Membership. This is a draft document and may be updated, replaced or
obsoleted by other documents at any time. It is inappropriate to cite this
document as other than work in progress.
This document was produced by a group
operating under the
W3C Patent Policy.
W3C maintains a
public list of any patent disclosures
made in connection with the deliverables of
the group; that page also includes
instructions for disclosing a patent. An individual who has actual
knowledge of a patent which the individual believes contains
Essential Claim(s)
must disclose the information in accordance with
section 6 of the W3C Patent Policy.
Audio, video, or data packets transmitted over a peer-connection can be lost, and
experience varying amounts of network delay. A web application implementing WebRTC expects
to monitor the performance of the underlying network and media pipeline.
This document defines the statistic identifiers used by the web application to extract
metrics from the user agent.
2. Conformance
As well as sections marked as non-normative, all authoring guidelines,
diagrams, examples, and notes in this specification are non-normative.
Everything else in this specification is normative.
The key words MAY, MUST, and MUST NOT in this document
are to be interpreted as described in
BCP 14
[RFC2119]
[RFC8174] when, and only when, they appear
in all capitals, as shown here.
This specification defines the conformance criteria that apply to a single product: the
user agent.
Implementations that use ECMAScript to implement the objects defined in this specification
MUST implement them in a manner consistent with the
ECMAScript Bindings defined in the Web IDL specification [WEBIDL], as this document uses
that specification and terminology.
This specification does not define what objects a conforming implementation should
generate. Specifications that refer to this specification need to specify conformance
themselves. They should include in their document text like the following (EXAMPLE ONLY):
An implementation MUST support generating statistics for the type
RTCInboundRtpStreamStats, with attributes packetsReceived, bytesReceived, packetsLost, and
jitter.
It MUST support generating statistics for the type RTCOutboundRtpStreamStats, with
attributes packetsSent, bytesSent.
For all subclasses of RTCRtpStreamStats, it MUST include ssrc and kind. When stats
exist for both sides of a connection, in the form of an inbound-rtp / remote-outbound-rtp
pair or an outbound-rtp / remote-inbound-rtp pair, the members remoteId and localId MUST
also be present.
It MAY support generating other stats.
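As an illustration (not part of the conformance text above), a page could read the members named in the example from a getStats() report, which [WEBRTC] defines as a maplike of stats object ids to stats dictionaries. The helper name summarizeRtpStats and the report contents below are hypothetical:

```javascript
// Illustrative only: collect the inbound-rtp and outbound-rtp members named
// in the example conformance text from a getStats() report (a Map-like of
// stats object id -> stats dictionary).
function summarizeRtpStats(report) {
  const summary = { inbound: [], outbound: [] };
  for (const stats of report.values()) {
    if (stats.type === "inbound-rtp") {
      const { ssrc, kind, packetsReceived, bytesReceived, packetsLost, jitter } = stats;
      summary.inbound.push({ ssrc, kind, packetsReceived, bytesReceived, packetsLost, jitter });
    } else if (stats.type === "outbound-rtp") {
      const { ssrc, kind, packetsSent, bytesSent } = stats;
      summary.outbound.push({ ssrc, kind, packetsSent, bytesSent });
    }
  }
  return summary;
}
```

In a page this would typically be called as `summarizeRtpStats(await pc.getStats())`.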
3. Terminology
The terms MediaStream, MediaStreamTrack, and Consumer are
defined in [GETUSERMEDIA].
The terms RTCPeerConnection, RTCDataChannel,
RTCDtlsTransport, RTCDtlsTransportState, RTCIceTransport,
RTCIceRole, RTCSctpTransport, RTCDataChannelState,
RTCIceCandidateType and RTCPriorityType are defined in [WEBRTC].
The term RTP stream is defined in [RFC7656] section 2.1.10.
The basic object of the stats model is the stats object. The following terms are
defined to describe it:
Monitored object
An internal object that keeps a set of data values. Most monitored objects are objects
defined in the WebRTC API; they may be thought of as being internal properties of those
objects.
Stats object
This is a set of values, copied out from a monitored object at a specific moment in time.
It is returned as a WebIDL dictionary through the getStats API call.
Stats object reference
A monitored object has a stable identifier "id", which is reflected in all stats
objects produced from the monitored object. Stats objects may contain references to
other stats objects using this "id" value. In a stats object, these references
are represented by a DOMString containing the "id" value of the referenced stats object.
All stats object references have type DOMString and attribute names ending in 'Id', or
they have type sequence<DOMString> and attribute names ending in 'Ids'.
Stats value
Refers to a single value within a stats object.
A monitored object changes the values it contains continuously over its lifetime, but is
never visible through the getStats API call. A stats object, once returned, never changes.
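The stats object reference mechanism described above amounts to a lookup in the same report: the value of a member ending in "Id" is the "id" of another stats object. A minimal sketch, with the helper name resolveReference and the report contents being hypothetical:

```javascript
// Illustrative only: given a stats object and the name of one of its
// reference members (ending in "Id"), return the referenced stats object
// from the same report, or undefined if the member is absent.
function resolveReference(report, stats, refName) {
  const id = stats[refName]; // e.g. stats.codecId, stats.transportId
  return id !== undefined ? report.get(id) : undefined;
}
```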
The stats API is defined in [WEBRTC]. It is defined to return a collection of stats
objects, each of which is a dictionary inheriting directly or indirectly from the
RTCStats dictionary. This API is normatively defined in [WEBRTC], but is
reproduced here for ease of reference.
When introducing a new stats object, the following principles should be followed:
An RTCStats object should correspond to an object defined in the specification it
supports.
The object MUST define a new value in the RTCStatsType enum, and MUST define the
syntax of the stats object it returns either by reference to an existing
sub-dictionary of RTCStats or by defining a new sub-dictionary of RTCStats.
All members of the new object need to have definitions that make them consistently
implementable. References to other specifications are a good way of doing this.
All members need to have defined behavior for what happens before the thing it counts
happens, or when the information it's supposed to show is not available. Usually, this
will be "start at zero" or "do not populate the value".
The new members of the stats dictionary need to be named according to standard practice
(camelCase), as per [API-DESIGN-PRINCIPLES].
Names ending in "Id" (such as "transportId") are always a stats object reference;
names ending in "Ids" (such as "trackIds") are always of type sequence<DOMString>,
where each DOMString is a stats object reference.
If the natural name for a stats value would end in "id" (such as when the stats value is
an in-protocol identifier for the monitored object), the recommended practice is to let
the name end in "identifier", such as "ssrcIdentifier" or "dataChannelIdentifier".
Stats are sampled by JavaScript. In general, an application will not have overall control
over how often stats are sampled, and the implementation cannot know what the intended
use of the stats is. There is, by design, no control surface for the application to
influence how stats are generated.
Therefore, letting the implementation compute "average" rates is not a good idea, since
that implies some averaging time interval that can't be set beforehand. Instead, the
recommended approach is to count the number of measurements of a value and to sum the
measurements, even if the sum is meaningless in itself; the JS application can then
compute averages over any desired time interval by calling getStats() twice, taking the
difference of the two sums and dividing by the difference of the two counts.
For stats that are measured against time, such as byte counts, no separate counter is
needed; one can instead divide by the difference in the timestamps.
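The two-sample technique above can be sketched as follows. The helper names are illustrative; the timestamp member is the DOMHighResTimeStamp (in milliseconds) that every stats object carries:

```javascript
// Average of a sampled value between two stats snapshots of the same
// monitored object: (sum2 - sum1) / (count2 - count1).
function averageBetween(earlier, later, sumKey, countKey) {
  const dCount = later[countKey] - earlier[countKey];
  return dCount > 0 ? (later[sumKey] - earlier[sumKey]) / dCount : undefined;
}

// Rate of a counter measured against time (e.g. bytesReceived) between two
// snapshots: divide by the timestamp difference, converted from ms to s.
function rateBetween(earlier, later, key) {
  const dt = (later.timestamp - earlier.timestamp) / 1000;
  return dt > 0 ? (later[key] - earlier[key]) / dt : undefined;
}
```

For example, the average QP per frame over an interval is `averageBetween(a, b, "qpSum", "framesDecoded")`, and the received bitrate in bytes per second is `rateBetween(a, b, "bytesReceived")`.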
4.2 Guidelines for implementing stats objects
When implementing stats objects, the following guidelines should be adhered to:
When a feature is not implemented on the platform, omit the dictionary member that is
tracking usage of the feature.
When a feature is not applicable to an instance of an object (for example audioLevel
on a video stream), omit the dictionary member. Do NOT report a count of zero, -1 or
"empty string".
When a counted feature hasn't been used yet, but may happen in the future, report a
count of zero.
4.3 Lifetime considerations for monitored objects
The object descriptions will say what the lifetime of a monitored object from the
perspective of stats is. When a monitored object is deleted, it no longer appears in
stats; until this happens, it will appear. This may or may not correspond to the actual
lifetime of an object in an implementation; what matters for this specification is what
appears in stats.
If a monitored object can only exist in a few instances over the lifetime of a
RTCPeerConnection, it may be simplest to consider it "eternal" and never delete it from
the set of objects reported on in stats. This type of object will remain visible until
the RTCPeerConnection is no longer available; it is also visible in getStats() after
pc.close(). This is the default when no lifetime is mentioned in its specification.
Objects that might exist in many instances over time should have a defined time at which
they are deleted, at which time they stop appearing in subsequent calls to getStats().
When an object is deleted, we can guarantee that no subsequent getStats() call will
contain a stats object reference that references the deleted object. We also
guarantee that the stats id of the deleted object will never be reused for another
object. This ensures that an application that collects stats objects for deleted
monitored objects will always be able to uniquely identify the object pointed to
in the result of any getStats() call.
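Because a deleted object's stats id is never reused, an application can detect deletions by comparing the id sets of two reports. A minimal sketch (the helper name is hypothetical):

```javascript
// Illustrative only: ids present in an earlier getStats() report but absent
// from a later one belong to monitored objects that have been deleted.
function deletedIds(earlierReport, laterReport) {
  const gone = [];
  for (const id of earlierReport.keys()) {
    if (!laterReport.has(id)) gone.push(id);
  }
  return gone;
}
```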
4.4 Guidelines for getStats() results caching/throttling
A call to getStats() touches many components of WebRTC and may take significant time to
execute. The implementation may or may not utilize caching or throttling of getStats()
calls for performance benefits; however, any implementation must adhere to the following:
When the state of the RTCPeerConnection visibly changes as a result of an API call, a
promise resolving or an event firing, subsequent new getStats() calls must return
up-to-date dictionaries for the affected objects. For example, if a track is added with
addTrack() subsequent getStats() calls must resolve with a corresponding
RTCMediaHandlerStats object. If setRemoteDescription() is called, removing a remote track,
then upon the promise resolving or an associated event (the stream's onremovetrack or the
track's onmute) firing, calling getStats() must resolve with an up-to-date
RTCMediaHandlerStats object.
When a stats object is deleted, subsequent getStats() calls MUST NOT return stats
for that monitored object.
5. Maintenance procedures for stats object types
5.1 Adding new stats objects
This document specifies the interoperable stats object types. Proposals for new object
types may be made in the editors' draft
maintained on GitHub. New standard types may appear in future revisions of the W3C
Recommendation.
If a need for a new stats object type or stats value within a stats object is found, an
issue should be raised on
GitHub, and a review process will decide on whether the stat should be added to the
editors' draft or not.
A pull request for a change to the editors' draft may serve as guidance for the
discussion, but the eventual merge is dependent on the review process.
While the WebRTC WG exists, it will serve as the review body; once it has disbanded, the
W3C will have to establish appropriate review.
The level of review sought is that of the IETF process' "expert review", as defined in
[RFC5226] section 4.1. The documentation needed includes the names of the new stats,
their data types, and the definitions they are based on, specified to a level that allows
interoperable implementation. The specification may consist of references to other
documents.
Another specification that wishes to refer to a specific version (for instance for
conformance) should refer to a dated version; these will be produced regularly when
updates happen.
5.2 Retiring stats objects
At times, it makes sense to retire the definition for a stats object or a stats value.
When this happens, it is not advisable to simply delete it from the spec, since there may
be implementations that use it, and it is important that the name is reserved
against re-use for another, incompatible definition.
Therefore, retired stats objects are moved to a separate section in this document.
Retired stats objects are moved there in their entirety; retired stats values are moved
to a "partial dictionary".
If there is no evidence that the retired object definition has ever been used (such as an
object that is added to the spec and renamed, redefined or removed prior to
implementation), the editors can decide to just remove the object from the spec.
6. RTCStatsType
The type element, of type RTCStatsType, indicates the type of the
object that the RTCStats object represents. An object with a given "type" can
have only one IDL dictionary type, but multiple "type" values may indicate the same IDL
dictionary type; for example, "local-candidate" and "remote-candidate" both use the IDL
dictionary type RTCIceCandidateStats.
This specification is normative for the allowed values of RTCStatsType.
The following strings are valid values for RTCStatsType:
codec
Statistics for a codec that is currently being used by RTP streams being sent or
received by this RTCPeerConnection object. It is accessed by the
RTCCodecStats.
inbound-rtp
Statistics for an inbound RTP stream that is currently received with this
RTCPeerConnection object. It is accessed by the
RTCInboundRtpStreamStats.
outbound-rtp
Statistics for an outbound RTP stream that is currently sent with this
RTCPeerConnection object. It is accessed by the
RTCOutboundRtpStreamStats.
remote-inbound-rtp
Statistics for the remote endpoint's inbound RTP stream corresponding to an outbound
stream that is currently sent with this RTCPeerConnection object. It is
measured at the remote endpoint and reported in an RTCP Receiver Report (RR) or RTCP
Extended Report (XR). It is accessed by the
RTCRemoteInboundRtpStreamStats.
remote-outbound-rtp
Statistics for the remote endpoint's outbound RTP stream corresponding to an inbound
stream that is currently received with this RTCPeerConnection object. It
is measured at the remote endpoint and reported in an RTCP Sender Report (SR). It is
accessed by the RTCRemoteOutboundRtpStreamStats.
media-source
Statistics for the media produced by a MediaStreamTrack that is
currently attached to an RTCRtpSender. This reflects the media that is
fed to the encoder; after getUserMedia() constraints have been applied
(i.e. not the raw media produced by the camera). It is either an
RTCAudioSourceStats or RTCVideoSourceStats
depending on its kind.
csrc
Statistics for a contributing source (CSRC) that contributed to an inbound RTP
stream. It is accessed by the RTCRtpContributingSourceStats.
peer-connection
Statistics related to the RTCPeerConnection object. It is accessed by
the RTCPeerConnectionStats.
data-channel
Statistics related to each RTCDataChannel id. It is accessed by the
RTCDataChannelStats.
stream
This is now obsolete. Contains statistics related to a specific
MediaStream. It is accessed by the obsolete dictionary
RTCMediaStreamStats.
track
This is now obsolete.
The monitored "track" object is deleted when the sender it reports on has its "track"
value changed to no longer refer to the same track.
transceiver
Statistics related to a specific RTCRtpTransceiver. It is accessed
by the RTCRtpTransceiverStats dictionary.
sender
Statistics related to a specific RTCRtpSender and the corresponding
media-level metrics. It is accessed by the RTCAudioSenderStats or
the RTCVideoSenderStats depending on kind.
receiver
Statistics related to a specific receiver and the corresponding media-level
metrics. It is accessed by the RTCAudioReceiverStats or the
RTCVideoReceiverStats depending on kind.
transport
Transport statistics related to the RTCPeerConnection object. It is
accessed by the RTCTransportStats.
sctp-transport
SCTP transport statistics related to an
RTCSctpTransport object. It is accessed by the
RTCSctpTransportStats dictionary.
candidate-pair
ICE candidate pair statistics related to the RTCIceTransport objects. It
is accessed by the RTCIceCandidatePairStats.
A candidate pair that is not the current pair for a transport is deleted when the
RTCIceTransport does an ICE restart, at the time the state changes to "new". The
candidate pair that is the current pair for a transport is deleted after an ICE
restart when the RTCIceTransport switches to using a candidate pair generated from
the new candidates; this time doesn't correspond to any other externally observable
event.
local-candidate
ICE local candidate statistics related to the RTCIceTransport objects.
It is accessed by the RTCIceCandidateStats for the local
candidate.
A local candidate is deleted when the RTCIceTransport does an ICE restart, and the
candidate is no longer a member of any non-deleted candidate pair.
remote-candidate
ICE remote candidate statistics related to the RTCIceTransport objects.
It is accessed by the RTCIceCandidateStats for the remote
candidate.
A remote candidate is deleted when the RTCIceTransport does an ICE restart, and the
candidate is no longer a member of any non-deleted candidate pair.
certificate
Information about a certificate used by an RTCIceTransport. It is accessed by the
RTCCertificateStats.
ice-server
Information about the connection to an ICE server (e.g. STUN or TURN). It is accessed by the
RTCIceServerStats.
7. Stats dictionaries
7.1 The RTP statistics hierarchy
The dictionaries for RTP statistics are structured as a hierarchy, so that those stats
that make sense in many different contexts are represented just once in IDL.
The metrics exposed here correspond to local measurements and to those reported by RTCP
packets. Compound RTCP packets contain multiple RTCP report blocks, such as Sender Report
(SR) and Receiver Report (RR), whereas a non-compound RTCP packet may contain just a
single RTCP SR or RR block.
The lifetime of all RTP monitored objects starts when the RTP stream is first
used: When the first RTP packet is sent or received on the SSRC it represents, or when
the first RTCP packet is sent or received that refers to the SSRC of the RTP stream.
RTCReceivedRtpStreamStats: Stats measured at the receiving end of an RTP
stream, known either because they're measured locally or transmitted via an RTCP
Receiver Report (RR) or Extended Report (XR) block.
RTCInboundRtpStreamStats: Stats that can only be measured at the local
receiving end of an RTP stream.
RTCRemoteInboundRtpStreamStats: Stats relevant to the remote receiving end
of an RTP stream - usually computed by combining local data with data received
via an RTCP RR or XR block.
RTCSentRtpStreamStats: Stats measured at the sending end of an RTP stream,
known either because they're measured locally or because they're received via RTCP,
usually in an RTCP Sender Report (SR).
ssrc of type unsigned long
The 32-bit unsigned integer value per [RFC3550] used to identify the source of
the stream of RTP packets that this stats object concerns.
kind of type DOMString
Either "audio" or "video". This MUST match the media
type part of the information in the corresponding codecType member of RTCCodecStats, and
MUST match the "kind" attribute of the related MediaStreamTrack.
transportId of type DOMString
It is a unique identifier that is associated to the object that was inspected to
produce the RTCTransportStats associated with this RTP stream.
codecId of type DOMString
It is a unique identifier that is associated to the object that was inspected to
produce the RTCCodecStats associated with this RTP stream.
packetsReceived of type unsigned long long
Total number of RTP packets received for this SSRC. At the receiving endpoint,
this is calculated as defined in [RFC3550] section 6.4.1. At the sending
endpoint the packetsReceived can be calculated by subtracting the packets lost
from the expected Highest Sequence Number reported in the RTCP Sender Report as
discussed in Appendix A.3. in [RFC3550].
packetsLost of type long long
Total number of RTP packets lost for this SSRC. Calculated as defined in
[RFC3550] section 6.4.1. Note that because of how this is estimated, it can be
negative if more packets are received than sent.
jitter of type double
Packet Jitter measured in seconds for this SSRC. Calculated as defined in section
6.4.1. of [RFC3550].
packetsDiscarded of type unsigned long long
The cumulative number of RTP packets discarded by the jitter buffer due to late
or early-arrival, i.e., these packets are not played out. RTP packets discarded
due to packet duplication are not reported in this metric [XRBLOCK-STATS].
Calculated as defined in [RFC7002] section 3.2 and Appendix A.a.
packetsRepaired of type unsigned long long
The cumulative number of lost RTP packets repaired after applying an
error-resilience mechanism [XRBLOCK-STATS]. It is measured for the primary
source RTP packets and only counted for RTP packets that have no further chance
of repair. To clarify, the value is upper-bound to the cumulative number of lost
packets. Calculated as defined in [RFC7509] section 3.1 and Appendix A.b.
burstPacketsLost of type unsigned long long
The cumulative number of RTP packets lost during loss bursts, Appendix A (c) of
[RFC6958].
burstPacketsDiscarded of type unsigned long long
The cumulative number of RTP packets discarded during discard bursts, Appendix A
(b) of [RFC7003].
burstLossCount of type unsigned long
The cumulative number of bursts of lost RTP packets, Appendix A (e) of
[RFC6958].
[RFC3611] recommends a Gmin (threshold) value of 16 for classifying a sequence
of packet losses or discards as a burst.
burstDiscardCount of type unsigned long
The cumulative number of bursts of discarded RTP packets, Appendix A (e) of
[RFC8015].
burstLossRate of type double
The fraction of RTP packets lost during bursts to the total number of RTP packets
expected in the bursts. As defined in Appendix A (a) of [RFC7004], however,
the actual value is reported without multiplying by 32768.
burstDiscardRate of type double
The fraction of RTP packets discarded during bursts to the total number of RTP
packets expected in bursts. As defined in Appendix A (e) of [RFC7004],
however, the actual value is reported without multiplying by 32768.
gapLossRate of type double
The fraction of RTP packets lost during the gap periods, as defined in Appendix A (b) of
[RFC7004]; however, the actual value is reported without multiplying by 32768.
gapDiscardRate of type double
The fraction of RTP packets discarded during the gap periods, as defined in Appendix A (f)
of [RFC7004]; however, the actual value is reported without multiplying by 32768.
framesDropped of type unsigned long
Only valid for video. The total number of frames dropped prior to decode or dropped
because the frame missed its display deadline for this receiver's track. The measurement
begins when the receiver is created and is a cumulative metric as defined in
Appendix A (g) of [RFC7004].
partialFramesLost of type unsigned long
Only valid for video. The cumulative number of partial frames lost. The measurement
begins when the receiver is created and is a cumulative metric as defined in
Appendix A (j) of [RFC7004]. This metric is incremented when the frame is sent
to the decoder. If the partial frame is received and recovered via retransmission
or FEC before decoding, the framesReceived counter is incremented.
fullFramesLost of type unsigned long
Only valid for video. The cumulative number of full frames lost. The measurement
begins when the receiver is created and is a cumulative metric as defined in
Appendix A (i) of [RFC7004].
7.5 RTCInboundRtpStreamStats dictionary
The RTCInboundRtpStreamStats dictionary represents the measurement metrics for
the incoming RTP media stream. The timestamp reported in the statistics object is the
time at which the data was sampled.
framesDecoded of type unsigned long
Only valid for video. It represents the total number of frames correctly decoded
for this RTP stream, i.e., frames that would be displayed if no frames are dropped.
keyFramesDecoded of type unsigned long
Only valid for video. It represents the total number of key frames, such as key
frames in VP8 [RFC6386] or IDR-frames in H.264 [RFC6184], successfully
decoded for this RTP media stream. This is a subset of
framesDecoded. framesDecoded - keyFramesDecoded gives
you the number of delta frames decoded.
frameWidth of type unsigned long
Only valid for video. Represents the width of the last decoded frame. Before the
first frame is decoded this attribute is missing.
frameHeight of type unsigned long
Only valid for video. Represents the height of the last decoded frame. Before
the first frame is decoded this attribute is missing.
frameBitDepth of type unsigned long
Only valid for video. Represents the bit depth per pixel of the last decoded frame.
Typical values are 24, 30, or 36 bits.
Before the first frame is decoded this attribute is missing.
framesPerSecond of type double
Only valid for video. The number of decoded frames in the last second.
qpSum of type unsigned long long
Only valid for video. The sum of the QP values of frames decoded by this
receiver. The count of frames is in framesDecoded.
The definition of QP value depends on the codec; for VP8, the QP value is the
value carried in the frame header as the syntax element "y_ac_qi", and defined in
[RFC6386] section 19.2. Its range is 0..127.
Note that the QP value is only an indication of quantizer values used; many
formats have ways to vary the quantizer value within the frame.
totalDecodeTime of type double
Total number of seconds that have been spent decoding the framesDecoded
frames of this stream. The average decode time can be calculated by dividing this
value with framesDecoded. The time it takes to decode one frame is the
time passed between feeding the decoder a frame and the decoder returning decoded
data for that frame.
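The average decode time computation described above can be sketched as a small helper (the name is hypothetical), guarding against division by zero before the first frame has been decoded:

```javascript
// Average per-frame decode time in seconds, from an inbound-rtp stats
// object: totalDecodeTime / framesDecoded.
function averageDecodeTime(inboundRtpStats) {
  const { totalDecodeTime, framesDecoded } = inboundRtpStats;
  if (!framesDecoded) return undefined; // no frames decoded yet
  return totalDecodeTime / framesDecoded;
}
```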
totalSquaredInterFrameDelay of type double
Sum of the squared interframe delays in seconds between consecutively decoded frames,
recorded just after a frame has been decoded. See totalInterFrameDelay for
details on how to calculate the interframe delay variance.
voiceActivityFlag of type boolean
Only valid for audio. Whether the last RTP packet whose frame was delivered to the
RTCRtpReceiver's MediaStreamTrack for playout contained voice activity or not based
on the presence of the V bit in the extension header, as defined in [RFC6464]. This
is the stats-equivalent of RTCRtpSynchronizationSource.voiceActivityFlag
in [WEBRTC].
lastPacketReceivedTimestamp of type DOMHighResTimeStamp
Represents the timestamp at which the last packet was received for this SSRC.
This differs from timestamp, which represents the time at which the
statistics were generated by the local endpoint.
averageRtcpInterval of type double
The average RTCP interval between two consecutive compound RTCP packets. This is
calculated by the sending endpoint when sending compound RTCP reports. Compound
packets must contain at least an RTCP RR or SR block and an SDES packet with the
CNAME item.
headerBytesReceived of type unsigned long long
Total number of RTP header and padding bytes received for this SSRC. This does
not include the size of transport layer headers such as IP or UDP.
headerBytesReceived + bytesReceived equals the number of bytes
received as payload over the transport.
fecPacketsReceived of type unsigned long long
Total number of RTP FEC packets received for this SSRC. This counter can also be
incremented when receiving FEC packets in-band with media packets (e.g., with
Opus).
fecPacketsDiscarded of type unsigned long long
Total number of RTP FEC packets received for this SSRC where the error correction
payload was discarded by the application. This may happen (1) if all the source
packets protected by the FEC packet were received or already recovered by a
separate FEC packet, or (2) if the FEC packet arrived late, i.e., outside the
recovery window, and the lost RTP packets have already been skipped during
playout. This is a subset of fecPacketsReceived.
bytesReceived of type unsigned long long
Total number of bytes received for this SSRC. Calculated as defined in
[RFC3550] section 6.4.1.
packetsFailedDecryption of type unsigned long long
The cumulative number of RTP packets that failed to be decrypted according to the
procedures in [RFC3711]. These packets are not counted by
packetsDiscarded.
packetsDuplicated of type unsigned long long
The cumulative number of packets discarded because they are duplicated. Duplicate
packets are not counted in packetsDiscarded.
Duplicated packets have the same RTP sequence number and content as a previously
received packet. If multiple duplicates of a packet are received, all of them are
counted.
An improved estimate of lost packets can be calculated by adding
packetsDuplicated to packetsLost; this will always result in a positive
number, but not the same number as RFC 3550 would calculate.
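The improved estimate described above is a simple sum; a sketch (the helper name is hypothetical):

```javascript
// packetsLost can be negative because duplicates are counted as "negative
// loss" by the RFC 3550 algorithm; adding packetsDuplicated back yields a
// non-negative loss estimate.
function estimatedPacketsLost(inboundRtpStats) {
  return inboundRtpStats.packetsLost + inboundRtpStats.packetsDuplicated;
}
```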
perDscpPacketsReceived of type record<USVString, unsigned long long>
Total number of packets received for this SSRC, per Differentiated Services code
point (DSCP) [RFC2474]. DSCPs are identified as decimal integers in string
form. Note that due to network remapping and bleaching, these numbers are not
expected to match the numbers seen on sending. Not all OSes make this information
available.
firCount of type unsigned long
Only valid for video. Count the total number of Full Intra Request (FIR) packets
sent by this receiver. Calculated as defined in [RFC5104] section 4.3.1. and
does not use the metric indicated in [RFC2032], because it was deprecated by
[RFC4587].
pliCount of type unsigned long
Only valid for video. Count the total number of Picture Loss Indication (PLI)
packets sent by this receiver. Calculated as defined in [RFC4585] section
6.3.1.
nackCount of type unsigned long
Count the total number of Negative ACKnowledgement (NACK) packets sent by this
receiver. Calculated as defined in [RFC4585] section 6.2.1.
sliCount of type unsigned long
Only valid for video. Count the total number of Slice Loss Indication (SLI)
packets sent by this receiver. Calculated as defined in [RFC4585] section
6.3.2.
estimatedPlayoutTimestamp of type DOMHighResTimeStamp
This is the estimated playout time of this receiver's track. The playout time is
the NTP timestamp of the last playable audio sample or video frame that has a known
timestamp (from an RTCP SR packet mapping RTP timestamps to NTP timestamps),
extrapolated with the time elapsed since it was ready to be played out. This is
the "current time" of the track in NTP clock time of the sender and can be present
even if there is no audio currently playing.
This can be useful for estimating how much audio and video is out of sync for two
tracks from the same source, audioTrackStats.estimatedPlayoutTimestamp -
videoTrackStats.estimatedPlayoutTimestamp.
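The out-of-sync estimate described above is the difference of the two timestamps, both expressed in the sender's NTP clock; a sketch with a hypothetical helper name:

```javascript
// How far ahead the audio track's playout is relative to the video track's,
// for two tracks from the same source (positive means audio leads).
function avSyncOffset(audioStats, videoStats) {
  return audioStats.estimatedPlayoutTimestamp - videoStats.estimatedPlayoutTimestamp;
}
```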
jitterBufferDelay of type double
It is the sum of the time, in seconds, each audio sample or video frame takes from
the time it is received to the time it exits the jitter buffer. This increases
upon samples or frames exiting, having completed their time in the buffer (and
incrementing jitterBufferEmittedCount). The average jitter buffer
delay can be calculated by dividing the jitterBufferDelay with the
jitterBufferEmittedCount.
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample.
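The average jitter buffer delay computation described above, as a small helper (the name is hypothetical), guarding against division by zero before anything has been emitted:

```javascript
// Average time, in seconds, that an emitted sample or frame spent in the
// jitter buffer: jitterBufferDelay / jitterBufferEmittedCount.
function averageJitterBufferDelay(inboundRtpStats) {
  const { jitterBufferDelay, jitterBufferEmittedCount } = inboundRtpStats;
  if (!jitterBufferEmittedCount) return undefined; // nothing emitted yet
  return jitterBufferDelay / jitterBufferEmittedCount;
}
```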
jitterBufferEmittedCount of type unsigned long long
The total number of audio samples or video frames that have come out of the
jitter buffer (increasing jitterBufferDelay).
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample.
totalSamplesReceived of type unsigned long long
Only valid for audio. The total number of samples that have been received on this
RTP stream. This includes concealedSamples.
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample.
samplesDecodedWithSilk of type unsigned long long
Only valid for audio and when the audio codec is Opus. The total number of
samples decoded by the SILK portion of the Opus codec.
samplesDecodedWithCelt of type unsigned long long
Only valid for audio and when the audio codec is Opus. The total number of
samples decoded by the CELT portion of the Opus codec.
concealedSamples of type unsigned long long
Only valid for audio. The total number of concealed samples. A
concealed sample is a sample that was replaced with synthesized samples generated
locally before being played out. Examples of samples that have to be concealed
are samples from lost packets (reported in packetsLost) or samples from packets that arrive
too late to be played out (reported in packetsDiscarded).
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample.
silentConcealedSamples of type unsigned long long
Only valid for audio. The total number of concealed samples inserted that are
"silent". Playing out silent samples results in silence or comfort noise. This is
a subset of concealedSamples.
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample.
concealmentEvents of type unsigned long long
Only valid for audio. The number of concealment events. This counter increases every
time a concealed sample is synthesized after a non-concealed sample. That is, multiple
consecutive concealed samples will increase the concealedSamples count multiple
times but is a single concealment event.
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample.
insertedSamplesForDeceleration of type unsigned long long
Only valid for audio. When playout is slowed down, this counter is increased by the
difference between the number of samples received and the number of samples played out.
If playout is slowed down by inserting samples, this will be the number of inserted
samples.
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample.
removedSamplesForAcceleration of type unsigned long long
Only valid for audio. When playout is sped up, this counter is increased by the
difference between the number of samples received and the number of samples played
out. If speedup is achieved by removing samples, this will be the count of samples
removed.
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample.
audioLevel of type double
Only valid for audio. Represents the audio level of the receiving track. For audio
levels of tracks attached locally, see RTCAudioSourceStats
instead.
The value is between 0..1 (linear), where 1.0 represents 0 dBov, 0 represents
silence, and 0.5 represents approximately 6 dBSPL change in the sound pressure
level from 0 dBov.
The audioLevel is averaged over some small interval, using the algorithm
described under totalAudioEnergy. The interval used is implementation
dependent.
totalAudioEnergy of type double
Only valid for audio. Represents the audio energy of the receiving track. For
audio energy of tracks attached locally, see
RTCAudioSourceStats instead.
This value MUST be computed as follows: for each audio sample that is received
(and thus counted by totalSamplesReceived), add the sample's
value divided by the highest-intensity encodable value, squared and then
multiplied by the duration of the sample in seconds. In other words,
duration * Math.pow(energy/maxEnergy, 2).
This can be used to obtain a root mean square (RMS) value that uses the same
units as audioLevel, as defined in [RFC6464]. It can be
converted to these units using the formula
Math.sqrt(totalAudioEnergy/totalSamplesDuration). This calculation
can also be performed using the differences between the values of two different
getStats() calls, in order to compute the average audio level over
any desired time interval. In other words, do Math.sqrt((energy2 -
energy1)/(duration2 - duration1)).
For example, if a 10ms packet of audio is produced with an RMS of 0.5 (out of
1.0), this should add 0.5 * 0.5 * 0.01 = 0.0025 to
totalAudioEnergy. If another 10ms packet with an RMS of 0.1 is
received, this should similarly add 0.0001 to
totalAudioEnergy. Then,
Math.sqrt(totalAudioEnergy/totalSamplesDuration) becomes
Math.sqrt(0.0026/0.02) = 0.36, which is the same value that would be
obtained by doing an RMS calculation over the contiguous 20ms segment of audio.
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample. The
"audio energy of a sample" refers to the highest energy of any
channel.
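The RMS conversions described above can be sketched as follows, replaying the worked example (10 ms at RMS 0.5 plus 10 ms at RMS 0.1, giving a totalAudioEnergy of 0.0026 over 0.02 seconds of samples); the snapshot values are hypothetical:

```javascript
// Sketch: convert totalAudioEnergy / totalSamplesDuration into an RMS value
// in the same units as audioLevel.
function rmsAudioLevel(stats) {
  return Math.sqrt(stats.totalAudioEnergy / stats.totalSamplesDuration);
}

function intervalRmsAudioLevel(older, newer) {
  // Average audio level between two getStats() snapshots.
  return Math.sqrt(
      (newer.totalAudioEnergy - older.totalAudioEnergy) /
      (newer.totalSamplesDuration - older.totalSamplesDuration));
}

// Values from the worked example in the text.
const audioStats = { totalAudioEnergy: 0.0026, totalSamplesDuration: 0.02 };
console.log(rmsAudioLevel(audioStats).toFixed(2)); // "0.36"
```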
totalSamplesDuration of type double
Only valid for audio. Represents the audio duration of the receiving track. For
audio durations of tracks attached locally, see
RTCAudioSourceStats instead.
Represents the total duration in seconds of all samples that have been received
(and thus counted by totalSamplesReceived). Can be used with
totalAudioEnergy to compute an average audio level over
different intervals.
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample.
framesReceived of type unsigned
long
Only valid for video. Represents the total number of complete frames received on
this RTP stream. This metric is incremented when the complete frame is received.
decoderImplementation of type DOMString
Identifies the decoder implementation used. This is useful for diagnosing
interoperability issues.
If too much information is given here, it increases the fingerprint surface.
Since it is only given for active tracks, the incremental exposure is small.
7.6
RTCRemoteInboundRtpStreamStats dictionary
The RTCRemoteInboundRtpStreamStats dictionary represents the remote endpoint's
measurement metrics for a particular incoming RTP stream (corresponding to an outgoing
RTP stream at the sending endpoint). The timestamp reported in the statistics object is
the time at which the corresponding RTCP RR was received.
Estimated round trip time for this SSRC based on the RTCP timestamps in the RTCP
Receiver Report (RR) and measured in seconds. Calculated as defined in section
6.4.1. of [RFC3550]. If no RTCP Receiver Report is received with a DLSR value
other than 0, the round trip time is left undefined.
totalRoundTripTime of type double
Represents the cumulative sum of all round trip time measurements in seconds
since the beginning of the session. The individual round trip time is calculated
based on the RTCP timestamps in the RTCP Receiver Report (RR) [RFC3550], hence
undefined round trip times are not added. The average round trip time can be computed
from totalRoundTripTime by dividing it by
roundTripTimeMeasurements.
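The average computation above can be sketched as follows; the "remote-inbound-rtp" entry is a hypothetical stand-in for a getStats() result:

```javascript
// Sketch: average round trip time from a "remote-inbound-rtp" stats entry.
function averageRoundTripTime(remoteInbound) {
  // Guard against division by zero before any valid RTT has been reported.
  if (!remoteInbound.roundTripTimeMeasurements) return undefined;
  return remoteInbound.totalRoundTripTime /
         remoteInbound.roundTripTimeMeasurements;
}

const remoteInbound = { totalRoundTripTime: 1.25, roundTripTimeMeasurements: 25 };
console.log(averageRoundTripTime(remoteInbound)); // 0.05 seconds
```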
fractionLost of type double
The fraction packet loss reported for this SSRC. Calculated as defined in
[RFC3550] section 6.4.1 and Appendix A.3.
reportsReceived of type unsigned long long
Represents the total number of RTCP RR blocks received for this SSRC.
roundTripTimeMeasurements of type unsigned long long
Represents the total number of RTCP RR blocks received for this SSRC that contain
a valid round trip time. This counter will not increment if the roundTripTime cannot
be calculated.
Total number of RTP packets sent for this SSRC. Calculated as defined in
[RFC3550] section 6.4.1.
bytesSent of type unsigned long
long
Total number of bytes sent for this SSRC. Calculated as defined in [RFC3550]
section 6.4.1.
7.8
RTCOutboundRtpStreamStats dictionary
The RTCOutboundRtpStreamStats dictionary represents the measurement metrics
for the outgoing RTP stream. The timestamp reported in the statistics object is the time
at which the data was sampled.
Exposes the rid
encoding parameter of this RTP stream if it has been set, otherwise it is
undefined. If set, this value will be present regardless of whether the RID RTP header
extension has been negotiated.
lastPacketSentTimestamp of type DOMHighResTimeStamp
Represents the timestamp at which the last packet was sent for this SSRC. This
differs from timestamp, which represents the time at which the
statistics were generated by the local endpoint.
headerBytesSent of type unsigned long long
Total number of RTP header and padding bytes sent for this SSRC. This does not
include the size of transport layer headers such as IP or UDP.
headerBytesSent + bytesSent equals the number of bytes sent as
payload over the transport.
packetsDiscardedOnSend of type unsigned long
Total number of RTP packets for this SSRC that have been discarded due to socket
errors, i.e. a socket error occurred when handing the packets to the socket. This
might happen due to various reasons, including full buffer or no available
memory.
bytesDiscardedOnSend of type unsigned long long
Total number of bytes for this SSRC that have been discarded due to socket
errors, i.e. a socket error occurred when handing the packets containing the bytes
to the socket. This might happen due to various reasons, including full buffer or
no available memory. Calculated as defined in [RFC3550] section 6.4.1.
fecPacketsSent of type unsigned
long
Total number of RTP FEC packets sent for this SSRC. This counter can also be
incremented when sending FEC packets in-band with media packets (e.g., with
Opus).
retransmittedPacketsSent of type unsigned long long
The total number of packets that were retransmitted for this SSRC. This is a
subset of packetsSent.
retransmittedBytesSent of type unsigned long long
The total number of bytes that were retransmitted for this SSRC, only including
payload bytes. This is a subset of bytesSent.
targetBitrate of type double
It is the current target bitrate configured for this particular SSRC and is the
Transport Independent Application Specific (TIAS) bitrate [RFC3890].
Typically, the target bitrate is a configuration parameter provided to the
codec's encoder and does not count the size of the IP or other transport layers
like TCP or UDP. It is measured in bits per second and the bitrate is calculated
over a 1 second window.
totalEncodedBytesTarget of type unsigned long long
This value is increased by the target frame size in bytes every time a frame has
been encoded. The actual frame size may be bigger or smaller than this number.
This value goes up every time framesEncoded goes up.
frameWidth of type unsigned
long
Only valid for video. Represents the width of the last encoded frame. The resolution
of the encoded frame may be lower than the media source (see RTCVideoSourceStats.width).
Before the first frame is encoded this attribute is missing.
frameHeight of type unsigned
long
Only valid for video. Represents the height of the last encoded frame. The resolution
of the encoded frame may be lower than the media source (see RTCVideoSourceStats.height).
Before the first frame is encoded this attribute is missing.
frameBitDepth of type unsigned
long
Only valid for video. Represents the bit depth per pixel of the last encoded frame.
Typical values are 24, 30, or 36 bits.
Before the first frame is encoded this attribute is missing.
framesPerSecond of type double
Only valid for video. The number of encoded frames during the last second. This may be
lower than the media source frame rate (see RTCVideoSourceStats.framesPerSecond).
framesSent of type unsigned
long
Only valid for video. Represents the total number of frames sent on this RTP stream.
hugeFramesSent of type unsigned
long
Only valid for video. Represents the total number of huge frames sent by this RTP
stream. Huge frames, by definition, are frames that have an encoded size at least
2.5 times the average size of the frames. The average size of the frames is defined
as the target bitrate per second divided by the target FPS at the time the frame was
encoded. These are usually frames that are complex to encode, with many changes in the
picture. This can be used to estimate, e.g., slide changes in a streamed presentation.
The multiplier of 2.5 was chosen by analyzing encoded frame sizes for a sample
presentation using a standalone WebRTC implementation. 2.5 is a reasonably large
multiplier that still caused all slide change events to be identified as huge
frames. It did, however, produce 1.4% false positive slide change detections,
which is deemed reasonable.
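The huge-frame threshold implied by this definition can be sketched as follows (the bitrate and frame rate values are purely illustrative):

```javascript
// Sketch: the encoded-size threshold above which a frame counts as "huge".
// targetBitrate is in bits per second; frame sizes are in bytes.
function hugeFrameThresholdBytes(targetBitrate, targetFps) {
  const averageFrameBytes = targetBitrate / 8 / targetFps;
  return 2.5 * averageFrameBytes;
}

// E.g. at 1 Mbit/s and 25 fps the average frame is 5000 bytes,
// so frames of 12500 bytes or more would be counted as huge.
console.log(hugeFrameThresholdBytes(1000000, 25)); // 12500
```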
framesEncoded of type unsigned long
Only valid for video. It represents the total number of frames successfully
encoded for this RTP media stream.
keyFramesEncoded of type unsigned long
Only valid for video. It represents the total number of key frames, such as key
frames in VP8 [RFC6386] or IDR-frames in H.264 [RFC6184], successfully
encoded for this RTP media stream. This is a subset of
framesEncoded. framesEncoded - keyFramesEncoded gives
you the number of delta frames encoded.
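The delta-frame derivation above can be sketched as follows; the entry is a hypothetical "outbound-rtp" snapshot:

```javascript
// Sketch: number of delta (non-key) frames encoded for a stream.
function deltaFramesEncoded(outbound) {
  return outbound.framesEncoded - outbound.keyFramesEncoded;
}

const outbound = { framesEncoded: 3000, keyFramesEncoded: 30 };
console.log(deltaFramesEncoded(outbound)); // 2970
```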
framesDiscardedOnSend of type unsigned long
Total number of video frames that have been discarded for this SSRC due to socket
errors, i.e. a socket error occurred when handing the packets to the socket. This
might happen due to various reasons, including full buffer or no available
memory.
qpSum of type unsigned long
long
Only valid for video. The sum of the QP values of frames encoded by this sender.
The count of frames is in framesEncoded.
The definition of QP value depends on the codec; for VP8, the QP value is the
value carried in the frame header as the syntax element "y_ac_qi", and defined in
[RFC6386] section 19.2. Its range is 0..127.
Note that the QP value is only an indication of quantizer values used; many
formats have ways to vary the quantizer value within the frame.
totalSamplesSent of type unsigned long long
Only valid for audio. The total number of samples that have been sent over this
RTP stream.
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample.
samplesEncodedWithSilk of type unsigned long long
Only valid for audio and when the audio codec is Opus. The total number of
samples encoded by the SILK portion of the Opus codec.
samplesEncodedWithCelt of type unsigned long long
Only valid for audio and when the audio codec is Opus. The total number of
samples encoded by the CELT portion of the Opus codec.
voiceActivityFlag of type boolean
Only valid for audio. Whether the last RTP packet sent contained voice activity
or not based on the presence of the V bit in the extension header, as defined in
[RFC6464].
totalEncodeTime of type double
Total number of seconds that have been spent encoding the framesEncoded
frames of this stream. The average encode time can be calculated by dividing this
value by framesEncoded. The time it takes to encode one frame is the
time passed between feeding the encoder a frame and the encoder returning encoded
data for that frame. This does not include any additional time it may take to
packetize the resulting data.
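The average encode time computation above can be sketched as follows; the snapshots are hypothetical "outbound-rtp" entries from two getStats() calls:

```javascript
// Sketch: average per-frame encode time, cumulative and over an interval.
function averageEncodeTime(stats) {
  return stats.totalEncodeTime / stats.framesEncoded;
}

function intervalEncodeTime(older, newer) {
  // Differencing two snapshots isolates the most recent interval.
  return (newer.totalEncodeTime - older.totalEncodeTime) /
         (newer.framesEncoded - older.framesEncoded);
}

const encThen = { totalEncodeTime: 10.0, framesEncoded: 2000 };
const encNow = { totalEncodeTime: 10.9, framesEncoded: 2100 };
console.log(averageEncodeTime(encNow));           // ≈ 0.0052 s per frame
console.log(intervalEncodeTime(encThen, encNow)); // ≈ 0.009 s per frame
```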
totalPacketSendDelay of type double
The total number of seconds that packets have spent buffered locally before being
transmitted onto the network. The time is measured from when a packet is emitted
from the RTP packetizer until it is handed over to the OS network socket. This
measurement is added to totalPacketSendDelay when
packetsSent is incremented.
averageRtcpInterval of type double
The average RTCP interval between two consecutive compound RTCP packets. This is
calculated by the sending endpoint when sending compound RTCP reports. Compound
packets must contain at least a RTCP RR or SR block and an SDES packet with the
CNAME item.
Only valid for video. The current reason for limiting the resolution and/or
framerate, or "none" if not limited.
qualityLimitationDurations of type record<DOMString, double>
Only valid for video. A record of the total time, in seconds, that this stream
has spent in each quality limitation state. The record includes a mapping for all
RTCQualityLimitationReason types, including "none".
The sum of all entries minus qualityLimitationDurations["none"]
gives the total time that the stream has been limited.
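That sum can be sketched as follows, using a hypothetical qualityLimitationDurations record:

```javascript
// Sketch: total time the stream spent quality limited, i.e. the sum of all
// entries in the record except the "none" entry.
function totalLimitedTime(durations) {
  let sum = 0;
  for (const [reason, seconds] of Object.entries(durations)) {
    if (reason !== 'none') sum += seconds;
  }
  return sum;
}

const durations = { none: 50.0, cpu: 7.5, bandwidth: 2.5, other: 0.0 };
console.log(totalLimitedTime(durations)); // 10 seconds limited
```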
qualityLimitationResolutionChanges of type unsigned long
Only valid for video. The number of times that the resolution has changed because
we are quality limited (qualityLimitationReason has a value other than
"none"). The counter is initially zero and increases when the
resolution goes up or down. For example, if a 720p track is sent as 480p for some
time and then recovers to 720p, qualityLimitationResolutionChanges
will have the value 2.
perDscpPacketsSent of type record<USVString, unsigned long long>
Total number of packets sent for this SSRC, per DSCP. DSCPs are identified as
decimal integers in string form.
nackCount of type unsigned
long
Count the total number of Negative ACKnowledgement (NACK) packets received by
this sender. Calculated as defined in [RFC4585] section 6.2.1.
firCount of type unsigned
long
Only valid for video. Count the total number of Full Intra Request (FIR) packets
received by this sender. Calculated as defined in [RFC5104] section 4.3.1. and
does not use the metric indicated in [RFC2032], because it was deprecated by
[RFC4587].
pliCount of type unsigned
long
Only valid for video. Count the total number of Picture Loss Indication (PLI)
packets received by this sender. Calculated as defined in [RFC4585] section
6.3.1.
sliCount of type unsigned
long
Only valid for video. Count the total number of Slice Loss Indication (SLI)
packets received by this sender. Calculated as defined in [RFC4585] section
6.3.2.
encoderImplementation of type DOMString
Identifies the encoder implementation used. This is useful for diagnosing
interoperability issues.
If too much information is given here, it increases the fingerprint surface.
Since it is only given for active tracks, the incremental exposure is small.
The resolution and/or framerate is primarily limited due to CPU load.
bandwidth
The resolution and/or framerate is primarily limited due to congestion cues
during bandwidth estimation. Typically, congestion control algorithms use
inter-arrival time, round-trip time, packet loss, or other congestion cues to perform
bandwidth estimation.
other
The resolution and/or framerate is primarily limited for a reason other than
the above.
7.10
RTCRemoteOutboundRtpStreamStats dictionary
The RTCRemoteOutboundRtpStreamStats dictionary represents the remote
endpoint's measurement metrics for its outgoing RTP stream (corresponding to an incoming
RTP stream at the local endpoint). The timestamp reported in the statistics object is
the time at which the corresponding RTCP SR was received.
remoteTimestamp, of type DOMHighResTimeStamp
[HIGHRES-TIME], represents the remote timestamp at which these statistics were
sent by the remote endpoint. This differs from timestamp, which
represents the time at which the statistics were generated or received by the
local endpoint. The remoteTimestamp, if present, is derived from the
NTP timestamp in an RTCP Sender Report (SR) block, which reflects the remote
endpoint's clock. That clock may not be synchronized with the local clock.
reportsSent of type unsigned long long
Represents the total number of RTCP SR blocks sent for this SSRC.
7.11
RTCMediaSourceStats dictionary
The RTCMediaSourceStats dictionary represents a track that is currently
attached to one or more senders. It contains information about media sources such as
frame rate and resolution prior to encoding. This is the media passed from the
MediaStreamTrack to the RTCRtpSenders. This is in contrast to
RTCOutboundRtpStreamStats whose members describe metrics as measured after
the encoding step. For example, a track may be captured from a high-resolution camera,
its frames downscaled due to track constraints and then further downscaled by the
encoders due to CPU and network conditions. This dictionary reflects the video frames or
audio samples passed out from the track - after track constraints have been applied but
before any encoding or further downsampling occurs.
Media source objects are of either subdictionary RTCAudioSourceStats or
RTCVideoSourceStats. The type is the same
("media-source") but kind is different ("audio" or
"video") depending on the kind of track.
The media source stats objects are created when a track is attached to any
RTCRtpSender and may subsequently be attached to multiple senders during its
life. The life of this object ends when the track is no longer attached to any sender of
the same RTCPeerConnection. If a track whose media source object ended is
attached again this results in a new media source stats object whose counters (such as
number of frames) are reset.
The value of the MediaStreamTrack's kind attribute.
This is either "audio" or "video". If it is
"audio" then this stats object is of type
RTCAudioSourceStats. If it is "video" then this stats
object is of type RTCVideoSourceStats.
7.12
RTCAudioSourceStats dictionary
The RTCAudioSourceStats dictionary represents an audio track that is attached
to one or more senders. It is an RTCMediaSourceStats whose
kind is "audio".
Represents the audio level of the media source. For audio levels of remotely
sourced tracks, see RTCAudioReceiverStats instead.
The value is between 0..1 (linear), where 1.0 represents 0 dBov, 0 represents
silence, and 0.5 represents approximately 6 dBSPL change in the sound pressure
level from 0 dBov.
The audioLevel is averaged over some small interval, using the algorithm
described under totalAudioEnergy. The interval used is implementation
dependent.
totalAudioEnergy of type double
Represents the audio energy of the media source. For audio energy of remotely
sourced tracks, see RTCAudioReceiverStats instead.
This value MUST be computed as follows: for each audio sample produced by the
media source during the lifetime of this stats object, add the sample's value
divided by the highest-intensity encodable value, squared and then multiplied by
the duration of the sample in seconds. In other words, duration *
Math.pow(energy/maxEnergy, 2).
This can be used to obtain a root mean square (RMS) value that uses the same
units as audioLevel, as defined in [RFC6464]. It can be
converted to these units using the formula
Math.sqrt(totalAudioEnergy/totalSamplesDuration). This calculation
can also be performed using the differences between the values of two different
getStats() calls, in order to compute the average audio level over
any desired time interval. In other words, do Math.sqrt((energy2 -
energy1)/(duration2 - duration1)).
For example, if a 10ms packet of audio is produced with an RMS of 0.5 (out of
1.0), this should add 0.5 * 0.5 * 0.01 = 0.0025 to
totalAudioEnergy. If another 10ms packet with an RMS of 0.1 is
produced, this should similarly add 0.0001 to
totalAudioEnergy. Then,
Math.sqrt(totalAudioEnergy/totalSamplesDuration) becomes
Math.sqrt(0.0026/0.02) = 0.36, which is the same value that would be
obtained by doing an RMS calculation over the contiguous 20ms segment of audio.
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample. The
"audio energy of a sample" refers to the highest energy of any
channel.
totalSamplesDuration of type double
Represents the audio duration of the media source. For audio durations of
remotely sourced tracks, see RTCAudioReceiverStats instead.
Represents the total duration in seconds of all samples that have been produced
by this source for the lifetime of this stats object. Can be used with
totalAudioEnergy to compute an average audio level over
different intervals.
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample.
echoReturnLoss of type double
Only present when the MediaStreamTrack is sourced from a microphone where
echo cancellation is applied. Calculated in decibels, as defined in [ECHO]
(2012) section 3.14.
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample. When
calculating echo return loss, the channel of the least audio
energy is considered for any sample.
echoReturnLossEnhancement of type double
Only present when the MediaStreamTrack is sourced from a microphone where
echo cancellation is applied. Calculated in decibels, as defined in [ECHO]
(2012) section 3.15.
Note: If multiple audio channels are used, metrics based on
samples do not increment at a higher rate. "A sample" refers
to having a sample in either channel - simultaneously having
samples in multiple channels counts as a single sample. When
calculating echo return loss enhancement, the channel of the
least audio energy is considered for any sample.
7.13
RTCVideoSourceStats dictionary
The RTCVideoSourceStats dictionary represents a video track that is attached
to one or more senders. It is an RTCMediaSourceStats whose
kind is "video".
The width, in pixels, of the last frame originating from this source. Before a
frame has been produced this attribute is missing.
height of type unsigned
long
The height, in pixels, of the last frame originating from this source. Before a
frame has been produced this attribute is missing.
bitDepth of type unsigned
long
The bit depth per pixel of the last frame originating from this source. Before a
frame has been produced this attribute is missing.
frames of type unsigned long
The total number of frames originating from this source.
framesPerSecond of type unsigned long
The number of frames originating from this source, measured during the last
second. For the first second of this object's lifetime this attribute is missing.
7.14
RTCRtpContributingSourceStats dictionary
The RTCRtpContributingSourceStats dictionary represents the measurement
metrics for a contributing source (CSRC) that is contributing to an incoming RTP stream.
Each contributing source produces a stream of RTP packets, which are combined by a mixer
into a single stream of RTP packets that is ultimately received by the WebRTC endpoint.
Information about the sources that contributed to this combined stream may be provided in
the CSRC list or [RFC6465] header extension of received RTP packets. The
timestamp of this stats object is the
most recent time an RTP packet the source contributed to was received and counted by
packetsContributedTo.
The SSRC identifier of the contributing source represented by this stats object,
as defined by [RFC3550]. It is a 32-bit unsigned integer that appears in the
CSRC list of any packets the relevant source contributed to.
inboundRtpStreamId of type DOMString
The ID of the RTCInboundRtpStreamStats object representing
the inbound RTP stream that this contributing source is contributing to.
packetsContributedTo of type unsigned long
The total number of RTP packets that this contributing source contributed to.
This value is incremented each time a packet is counted by
RTCInboundRtpStreamStats.packetsReceived, and the packet's CSRC
list (as defined by [RFC3550] section 5.1) contains the SSRC identifier of
this contributing source, contributorSsrc.
audioLevel of type double
Present if the last received RTP packet that this source contributed to contained
an [RFC6465] mixer-to-client audio level header extension. The value of
audioLevel is between 0..1 (linear), where 1.0 represents 0 dBov, 0
represents silence, and 0.5 represents approximately 6 dBSPL change in the sound
pressure level from 0 dBov.
The [RFC6465] header extension contains values in the range 0..127, in units
of -dBov, where 127 represents silence. To convert these values to the linear
0..1 range of audioLevel, a value of 127 is converted to 0, and all
other values are converted using the equation: f(rfc6465_level) =
10^(-rfc6465_level/20).
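The dBov-to-linear conversion described above can be sketched as:

```javascript
// Sketch of the conversion: RFC 6465 levels are 0..127 in units of -dBov,
// with 127 meaning silence; audioLevel is linear in the range 0..1.
function rfc6465ToLinear(level) {
  if (level === 127) return 0;      // 127 encodes silence
  return Math.pow(10, -level / 20); // -dBov to linear amplitude
}

console.log(rfc6465ToLinear(0));   // 1 (0 dBov, full scale)
console.log(rfc6465ToLinear(127)); // 0 (silence)
```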
Represents the number of unique DataChannels that have entered the "open" state
during their lifetime.
dataChannelsClosed of type unsigned long
Represents the number of unique DataChannels that have left the "open" state
during their lifetime (due to being closed by either end or the underlying
transport being closed). DataChannels that transition from "connecting" to
"closing" or "closed" without ever being "open" are not counted in this number.
dataChannelsRequested of type unsigned long
Represents the number of unique DataChannels returned from a successful
createDataChannel() call on the RTCPeerConnection. If the underlying data
transport is not established, these may be in the "connecting" state.
dataChannelsAccepted of type unsigned long
Represents the number of unique DataChannels signaled in a "datachannel" event on
the RTCPeerConnection.
The total number of open data channels at any time can be calculated as
dataChannelsOpened - dataChannelsClosed. This number is never negative.
The sum of dataChannelsRequested and dataChannelsAccepted is always greater than or
equal to dataChannelsOpened - the difference is equal to the number of channels that
have been requested, but have not reached the "open" state.
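The arithmetic above can be sketched as follows; the entry is a hypothetical "peer-connection" stats snapshot:

```javascript
// Sketch: counts derived from the data channel counters described above.
function openDataChannels(stats) {
  // Channels currently in the "open" state.
  return stats.dataChannelsOpened - stats.dataChannelsClosed;
}

function neverOpenedDataChannels(stats) {
  // Channels requested or accepted that have not (yet) reached "open".
  return stats.dataChannelsRequested + stats.dataChannelsAccepted -
         stats.dataChannelsOpened;
}

const pcStats = {
  dataChannelsRequested: 4,
  dataChannelsAccepted: 2,
  dataChannelsOpened: 5,
  dataChannelsClosed: 1,
};
console.log(openDataChannels(pcStats));        // 4 currently open
console.log(neverOpenedDataChannels(pcStats)); // 1 not yet open
```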
7.16
RTCRtpTransceiverStats dictionary
An RTCRtpTransceiverStats stats object represents an RTCRtpTransceiver
of an RTCPeerConnection.
It appears as soon as the monitored RTCRtpTransceiver object is
created, such as by invoking addTransceiver, addTrack or
setRemoteDescription. RTCRtpTransceiverStats objects can only be
deleted if the corresponding RTCRtpTransceiver is removed - this can
only happen if a remote description is rolled back.
Only applicable for 'track' stats. True if the source is remote, for instance if it
is sourced from another host via an RTCPeerConnection. False otherwise.
ended of type boolean
Reflects the "ended" state of the track.
kind of type DOMString
Either "audio" or "video". This reflects the "kind"
attribute of the MediaStreamTrack, see [GETUSERMEDIA].
An RTCSenderVideoTrackAttachmentStats object represents the stats about one attachment of
a video MediaStreamTrack to the RTCPeerConnection object for which one calls getStats.
It appears in the stats as soon as it is attached (via addTrack, via addTransceiver, via
replaceTrack on an RTCRtpSender object).
If a video track is attached twice (via addTransceiver or replaceTrack), there will be
two RTCSenderVideoTrackAttachmentStats objects, one for each attachment. They will have
the same "trackIdentifier" attribute, but different "id" attributes.
An RTCSenderAudioTrackAttachmentStats object represents the stats about one attachment of
an audio MediaStreamTrack to the RTCPeerConnection object for which one calls getStats.
It appears in the stats as soon as it is attached (via addTrack, via addTransceiver, via
replaceTrack on an RTCRtpSender object).
If an audio track is attached twice (via addTransceiver or replaceTrack), there will be
two RTCSenderAudioTrackAttachmentStats objects, one for each attachment. They will have
the same "trackIdentifier" attribute, but different "id" attributes.
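The shared "trackIdentifier" can be used to group attachment stats per track. A minimal sketch, assuming the attachment entries are reported with the "track" stats type and that the report is the maplike RTCStatsReport returned by getStats (modeled here as a Map); the helper name is illustrative:

```javascript
// Illustrative helper, not part of the API: group track-attachment stats
// entries by the underlying MediaStreamTrack. Entries with the same
// "trackIdentifier" but different "id" values are attachments of the
// same track.
function attachmentsByTrack(report) {
  const byTrack = new Map();
  for (const stats of report.values()) {
    if (stats.type !== "track" || !stats.trackIdentifier) continue;
    if (!byTrack.has(stats.trackIdentifier)) byTrack.set(stats.trackIdentifier, []);
    byTrack.get(stats.trackIdentifier).push(stats.id);
  }
  return byTrack;
}
```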
messagesSent of type unsigned long
Represents the total number of API "message" events sent.
bytesSent of type unsigned long long
Represents the total number of payload bytes sent on this
RTCDataChannel, i.e., not including headers or padding.
messagesReceived of type unsigned long
Represents the total number of API "message" events received.
bytesReceived of type unsigned long long
Represents the total number of bytes received on this
RTCDataChannel, i.e., not including headers or padding.
7.27
RTCTransportStats dictionary
An RTCTransportStats object represents the stats corresponding to an
RTCDtlsTransport and its underlying
RTCIceTransport. When RTCP multiplexing is used, one transport is
used for both RTP and RTCP. Otherwise, RTP and RTCP will be sent on separate transports,
and rtcpTransportStatsId can be used to pair the resulting
RTCTransportStats objects. Additionally, when bundling is used, a single
transport will be used for all MediaStreamTracks in the bundle group.
If bundling is not used, different MediaStreamTracks will use
different transports. RTCP multiplexing and bundling are described in [WEBRTC].
packetsSent of type unsigned long long
Represents the total number of packets sent over this transport.
packetsReceived of type unsigned long long
Represents the total number of packets received on this transport.
bytesSent of type unsigned long long
Represents the total number of payload bytes sent on this
transport, i.e., not including headers or padding.
bytesReceived of type unsigned long long
Represents the total number of bytes received on this
transport, i.e., not including headers or padding.
rtcpTransportStatsId of type DOMString
If RTP and RTCP are not multiplexed, this is the id of the transport
that gives stats for the RTCP component, and this record has only the RTP
component stats.
Set to the current value of the "state" attribute of the underlying
RTCDtlsTransport.
selectedCandidatePairId of type DOMString
It is a unique identifier that is associated with the object that was inspected to
produce the RTCIceCandidatePairStats associated with this transport.
localCertificateId of type DOMString
For components where DTLS is negotiated, gives the local certificate.
remoteCertificateId of type DOMString
For components where DTLS is negotiated, gives the remote certificate.
tlsVersion of type DOMString
For components where DTLS is negotiated, the TLS version agreed. Only present
after DTLS negotiation is complete.
The value comes from ServerHello.supported_versions if present, otherwise from
ServerHello.version. It is represented as four upper case
hexadecimal digits, representing the two
bytes of the version.
dtlsCipher of type DOMString
Descriptive name of the cipher suite used for the DTLS transport, as defined in
the "Description" column of the IANA cipher suite registry [IANA-TLS-CIPHERS].
srtpCipher of type DOMString
Descriptive name of the protection profile used for the SRTP transport, as
defined in the "Profile" column of the IANA DTLS-SRTP protection profile registry
[IANA-DTLS-SRTP] and described further in [RFC5764].
tlsGroup of type DOMString
Descriptive name of the group used for the encryption, as defined in the
"Description" column of the IANA TLS Supported Groups registry
[IANA-TLS-GROUPS].
selectedCandidatePairChanges of type unsigned long
The number of times that the selected candidate pair of this transport has
changed. Going from not having a selected candidate pair to having a
selected candidate pair, or the other way around, also increases this
counter. It is initially zero and becomes one when an initial candidate
pair is selected.
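The id references described in this section can be followed through the report to pair RTP and RTCP transports and to resolve the currently selected route. A minimal sketch (the helper names are illustrative; the report is the maplike RTCStatsReport, modeled here as a Map):

```javascript
// Illustrative helpers, not part of the API.
// When RTP and RTCP are not multiplexed, pair each RTP transport record
// with its RTCP counterpart via rtcpTransportStatsId.
function rtpRtcpPairs(report) {
  const pairs = [];
  for (const stats of report.values()) {
    if (stats.type !== "transport" || !stats.rtcpTransportStatsId) continue;
    pairs.push({ rtp: stats, rtcp: report.get(stats.rtcpTransportStatsId) });
  }
  return pairs;
}

// Resolve a transport's selected candidate pair down to the local and
// remote candidate stats objects.
function selectedRoute(report, transportId) {
  const transport = report.get(transportId);
  if (!transport || !transport.selectedCandidatePairId) return null;
  const pair = report.get(transport.selectedCandidatePairId);
  return {
    local: report.get(pair.localCandidateId),
    remote: report.get(pair.remoteCandidateId),
  };
}
```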
7.28
RTCSctpTransportStats dictionary
An RTCSctpTransportStats object represents the stats
corresponding to an RTCSctpTransport described in
[WEBRTC].
The latest smoothed round-trip time value, corresponding to
spinfo_srtt defined in [RFC6458] but converted to seconds.
If there has been no round-trip time measurements yet, this
value is undefined.
7.29
RTCIceCandidateStats dictionary
RTCIceCandidateStats reflects the properties of a candidate in
Section 15.1 of [RFC5245]. It corresponds to an RTCIceCandidate object.
It is a unique identifier that is associated with the object that was inspected to
produce the RTCTransportStats associated with this candidate.
address of type DOMString
It is the address of the candidate, allowing for IPv4 addresses, IPv6 addresses,
and fully qualified domain names (FQDNs). See [RFC5245] section 15.1 for
details.
The user agent should make sure that only remote candidate addresses that the web
application has configured on the corresponding RTCPeerConnection are exposed;
this is especially important for peer-reflexive remote candidates. By default,
the user agent MUST leave the 'address' member as null in the
RTCIceCandidateStats dictionary of any remote candidate. Once an RTCPeerConnection
instance learns an address from the web application via addIceCandidate,
the user agent can expose the 'address' member value in any remote
RTCIceCandidateStats dictionary of the corresponding RTCPeerConnection that
matches the newly learned address.
port of type long
It is the port number of the candidate.
protocol of type DOMString
Valid values for the transport are udp and tcp, based
on the "transport" defined in [RFC5245] section 15.1.
relayProtocol of type DOMString
It is the protocol used by the endpoint to communicate with the TURN server. This
is only present for local candidates. Valid values are udp,
tcp, or tls.
It is a unique identifier that is associated with the object that was inspected to
produce the RTCTransportStats associated with this candidate pair.
localCandidateId of type DOMString
It is a unique identifier that is associated with the object that was inspected to
produce the RTCIceCandidateStats for the local candidate associated
with this candidate pair.
remoteCandidateId of type DOMString
It is a unique identifier that is associated with the object that was inspected to
produce the RTCIceCandidateStats for the remote candidate associated
with this candidate pair.
Represents the state of the checklist for the local and remote candidates in a
pair.
nominated of type boolean
Related to updating the nominated flag described in Section 7.1.3.2.4 of
[RFC5245].
packetsSent of type unsigned long long
Represents the total number of packets sent on this candidate pair.
packetsReceived of type unsigned long long
Represents the total number of packets received on this candidate pair.
bytesSent of type unsigned long long
Represents the total number of payload bytes sent on this candidate pair, i.e.,
not including headers or padding.
bytesReceived of type unsigned long long
Represents the total number of payload bytes received on this candidate pair,
i.e., not including headers or padding.
lastPacketSentTimestamp of type DOMHighResTimeStamp
Represents the timestamp at which the last packet was sent on this particular
candidate pair, excluding STUN packets.
lastPacketReceivedTimestamp of type DOMHighResTimeStamp
Represents the timestamp at which the last packet was received on this particular
candidate pair, excluding STUN packets.
firstRequestTimestamp of type DOMHighResTimeStamp
Represents the timestamp at which the first STUN request was sent on this
particular candidate pair.
lastRequestTimestamp of type DOMHighResTimeStamp
Represents the timestamp at which the last STUN request was sent on this
particular candidate pair. The average interval between two consecutive
connectivity checks sent can be calculated with (lastRequestTimestamp -
firstRequestTimestamp) / requestsSent.
lastResponseTimestamp of type DOMHighResTimeStamp
Represents the timestamp at which the last STUN response was received on this
particular candidate pair.
totalRoundTripTime of type double
Represents the sum of all round trip time measurements in seconds since the
beginning of the session, based on STUN connectivity check [STUN-PATH-CHAR]
responses (responsesReceived), including those that reply to requests that are
sent in order to verify consent [RFC7675]. The average round trip time can be
computed from totalRoundTripTime by dividing it by
responsesReceived.
currentRoundTripTime of type double
Represents the latest round trip time measured in seconds, computed from both
STUN connectivity checks [STUN-PATH-CHAR], including those that are sent for
consent verification [RFC7675].
availableOutgoingBitrate of type double
It is calculated by the underlying congestion control by combining the available
bitrate for all the outgoing RTP streams using this candidate pair. The bitrate
measurement does not count the size of the IP or other transport layers like TCP
or UDP. It is similar to the TIAS defined in [RFC3890], i.e., it is measured
in bits per second and the bitrate is calculated over a 1 second window.
Implementations that do not calculate a sender-side estimate MUST leave this
undefined. Additionally, the value MUST be undefined for candidate pairs that
were never used. For pairs in use, the estimate is normally no lower than the
bitrate for the packets sent at lastPacketSentTimestamp, but might
be higher. For candidate pairs that are not currently in use but were used
before, implementations MUST return undefined.
availableIncomingBitrate of type double
It is calculated by the underlying congestion control by combining the available
bitrate for all the incoming RTP streams using this candidate pair. The bitrate
measurement does not count the size of the IP or other transport layers like TCP
or UDP. It is similar to the TIAS defined in [RFC3890], i.e., it is measured
in bits per second and the bitrate is calculated over a 1 second window.
Implementations that do not calculate a receiver-side estimate MUST leave this
undefined. Additionally, the value MUST be undefined for candidate pairs that
were never used. For pairs in use, the estimate is normally no lower than the
bitrate for the packets received at lastPacketReceivedTimestamp, but
might be higher. For candidate pairs that are not currently in use but were used
before, implementations MUST return undefined.
circuitBreakerTriggerCount of type unsigned long
Represents the number of times the circuit breaker is triggered for this
particular 5-tuple. Ceasing transmission when a circuit breaker is triggered is
defined in Section 4.5 of [RFC8083]. The field MUST return undefined for
user-agents that do not implement the circuit-breaker algorithm.
requestsReceived of type unsigned long long
Represents the total number of connectivity check requests received (including
retransmissions). It is impossible for the receiver to tell whether the request
was sent in order to check connectivity or check consent, so all connectivity
checks requests are counted here.
requestsSent of type unsigned long long
Represents the total number of connectivity check requests sent (not including
retransmissions).
responsesReceived of type unsigned long long
Represents the total number of connectivity check responses received.
responsesSent of type unsigned long long
Represents the total number of connectivity check responses sent. Since we cannot
distinguish connectivity check requests and consent requests, all responses are
counted.
retransmissionsReceived of type unsigned long long
Represents the total number of connectivity check request retransmissions
received. Retransmissions are defined as connectivity check requests with a
TRANSACTION_TRANSMIT_COUNTER attribute where the "req" field is larger than 1, as
defined in [RFC7982].
retransmissionsSent of type unsigned long long
Represents the total number of connectivity check request retransmissions sent.
consentRequestsSent of type unsigned long long
Represents the total number of consent requests sent.
consentExpiredTimestamp of type DOMHighResTimeStamp
Represents the timestamp at which the latest valid STUN binding response expired,
as defined in [RFC7675] section 5.1. If a valid STUN binding response has not
been received (responsesReceived is zero) or the latest one has not
expired, this value MUST be undefined.
packetsDiscardedOnSend of type unsigned long
Total number of packets for this candidate pair that have been discarded due to socket
errors, i.e. a socket error occurred when handing the packets to the socket. This
might happen due to various reasons, including full buffer or no available
memory.
bytesDiscardedOnSend of type unsigned long long
Total number of bytes for this candidate pair that have been discarded due to socket
errors, i.e. a socket error occurred when handing the packets containing the bytes
to the socket. This might happen due to various reasons, including full buffer or
no available memory. Calculated as defined in [RFC3550] section 6.4.1.
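The two averages described in this section (mean interval between connectivity checks, and mean round trip time) can be computed directly from a candidate-pair entry. A minimal sketch; the helper names are illustrative:

```javascript
// Illustrative helpers, not part of the API.
// Mean interval (ms) between consecutive connectivity checks sent,
// computed as (lastRequestTimestamp - firstRequestTimestamp) / requestsSent.
function averageCheckIntervalMs(pairStats) {
  if (!pairStats.requestsSent) return undefined;
  return (pairStats.lastRequestTimestamp - pairStats.firstRequestTimestamp) /
         pairStats.requestsSent;
}

// Mean round trip time in seconds over all STUN responses received,
// computed as totalRoundTripTime / responsesReceived.
function averageRoundTripTime(pairStats) {
  if (!pairStats.responsesReceived) return undefined;
  return pairStats.totalRoundTripTime / pairStats.responsesReceived;
}
```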
fingerprint of type DOMString
The fingerprint of the certificate, computed as defined in
Section 5 of [RFC4572].
fingerprintAlgorithm of type DOMString
The hash function used to compute the certificate fingerprint. For instance,
"sha-256".
base64Certificate of type DOMString
The base-64 representation of the DER-encoded certificate.
issuerCertificateId of type DOMString
The issuerCertificateId refers to the stats object that contains the next
certificate in the certificate chain. If the current certificate is at the end of
the chain (i.e. a self-signed certificate), this will not be set.
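The issuerCertificateId links can be followed to reconstruct the certificate chain. A minimal sketch (the helper name is illustrative; the report is the maplike RTCStatsReport, modeled here as a Map):

```javascript
// Illustrative helper, not part of the API: collect the fingerprints of a
// certificate chain by following issuerCertificateId links until a
// chain-terminating (e.g. self-signed) certificate is reached.
function certificateChain(report, certStatsId) {
  const chain = [];
  let current = report.get(certStatsId);
  while (current) {
    chain.push(current.fingerprint);
    current = current.issuerCertificateId
      ? report.get(current.issuerCertificateId)
      : undefined;
  }
  return chain;
}
```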
This entire dictionary was made obsolete in September, 2019 due to
sender, receiver and transceiver stats objects being a better fit to
describe the modern RTCPeerConnection model (Unified Plan).
streamIdentifier of type DOMString
The value of the stream.id property.
trackIds of type sequence<DOMString>
This is the id of the stats object, not the track.id.
The RTCReceiverVideoTrackAttachmentStats is a copy of
RTCVideoReceiverStats. It adds no new information, it
only exists for backwards-compatibility reasons as an obsolete
dictionary.
The RTCReceiverAudioTrackAttachmentStats is a copy of
RTCAudioReceiverStats. It adds no new information, it
only exists for backwards-compatibility reasons as an obsolete
dictionary.
This field was obsoleted by the statsended event in
[WEBRTC], which has since also been obsoleted.
deleted is no longer applicable because if the ICE
candidate is deleted it no longer appears in getStats().
isRemote of type boolean
false indicates that this represents a local candidate;
true indicates that this represents a remote candidate.
frameWidth of type unsigned long
This was moved to RTCOutboundRtpStreamStats and
RTCInboundRtpStreamStats in August 2019.
frameHeight of type unsigned long
This was moved to RTCOutboundRtpStreamStats and
RTCInboundRtpStreamStats in August 2019.
framesPerSecond of type double
For the sending case, this was replaced by RTCVideoSourceStats.framesPerSecond
in May 2019 representing the frame rate of the track. For the receiving case,
this was moved to RTCInboundRtpStreamStats in August 2019 representing the
decoding frame rate. In August 2019, framesPerSecond was also added to
RTCOutboundRtpStreamStats, representing the encoding frame rate (which may be
lower than the source frame rate).
This field was replaced by keyFramesEncoded in RTCOutboundRtpStreamStats in June
2019. There were no known implementations supporting the old field at the time of
the change.
framesCaptured of type unsigned long
This was replaced by RTCVideoSourceStats.frames in May 2019.
framesSent of type unsigned long
This was moved to RTCOutboundRtpStreamStats in August 2019.
hugeFramesSent of type unsigned long
This was moved to RTCOutboundRtpStreamStats in August 2019.
This field was replaced by keyFramesDecoded in RTCInboundRtpStreamStats in June
2019. There were no known implementations supporting the old field at the time of
the change.
estimatedPlayoutTimestamp of type DOMHighResTimeStamp
This was moved to RTCInboundRtpStreamStats in August 2019.
jitterBufferDelay of type double
This was moved to RTCInboundRtpStreamStats in August 2019.
jitterBufferEmittedCount of type unsigned long long
This was moved to RTCInboundRtpStreamStats in August 2019.
framesReceived of type unsigned long
This was moved to RTCInboundRtpStreamStats in August 2019.
framesDecoded of type unsigned long
This was moved to RTCInboundRtpStreamStats in August 2019.
framesDropped of type unsigned long
This was moved to RTCInboundRtpStreamStats in August 2019.
partialFramesLost of type unsigned long
This was moved to RTCInboundRtpStreamStats in August 2019.
fullFramesLost of type unsigned long
This was moved to RTCInboundRtpStreamStats in August 2019.
9.
Examples
9.1
Example of a stats application
Consider the case where the user is experiencing bad sound and the application wants to
determine if the cause of it is packet loss. The following example code might be used:
var baselineReport, currentReport;
var sender = pc.getSenders()[0];
sender.getStats().then(function (report) {
  baselineReport = report;
})
.then(function () {
  return new Promise(function (resolve) {
    setTimeout(resolve, aBit); // ... wait a bit
  });
})
.then(function () {
  return sender.getStats();
})
.then(function (report) {
  currentReport = report;
  processStats();
})
.catch(function (error) {
  console.log(error.toString());
});
function processStats() {
  // compare the elements from the current report with the baseline
  for (let now of currentReport.values()) {
    if (now.type != "outbound-rtp")
      continue;
    // get the corresponding stats from the baseline report
    let base = baselineReport.get(now.id);
    if (base) {
      var remoteNow = currentReport.get(now.remoteId);
      var remoteBase = baselineReport.get(base.remoteId);
      var packetsSent = now.packetsSent - base.packetsSent;
      var packetsReceived = remoteNow.packetsReceived - remoteBase.packetsReceived;
      // if intervalFractionLoss is > 0.3, we've probably found the culprit
      var intervalFractionLoss = (packetsSent - packetsReceived) / packetsSent;
    }
  }
}
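The same measurement can also be written more compactly with async/await. A sketch under the same assumptions as the example above; measureIntervalLoss is an illustrative name, and sender is any RTCRtpSender:

```javascript
// Illustrative async/await variant of the example above: sample the
// sender's stats twice, `intervalMs` apart, and return the fraction of
// packets lost in that interval for the first outbound-rtp stream.
async function measureIntervalLoss(sender, intervalMs) {
  const baseline = await sender.getStats();
  await new Promise((resolve) => setTimeout(resolve, intervalMs));
  const current = await sender.getStats();
  for (const now of current.values()) {
    if (now.type !== "outbound-rtp") continue;
    const base = baseline.get(now.id);
    if (!base) continue;
    const remoteNow = current.get(now.remoteId);
    const remoteBase = baseline.get(base.remoteId);
    const sent = now.packetsSent - base.packetsSent;
    const received = remoteNow.packetsReceived - remoteBase.packetsReceived;
    return (sent - received) / sent; // interval fraction loss
  }
}
```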
10.
Security and Privacy Considerations
The data exposed by WebRTC Statistics include most of the media and network data also
exposed by [GETUSERMEDIA] and [WEBRTC] - as such, all the privacy and security
considerations of these specifications related to data exposure apply as well to this
specification.
For instance, the round-trip time exposed in RTCRemoteInboundRtpStreamStats can give
some coarse indication of how far apart the peers are located, and thus, if one of the
peer's location is known, this may reveal information about the other peer.
When applied to isolated streams, media metrics may allow an application to infer some
characteristics of the isolated stream, such as if anyone is speaking (by watching the
audioLevel statistic).
The following stats are deemed to be sensitive, and MUST NOT be reported for an
isolated media stream:
[#114] Minor clarification regarding stats object lifetime
[#157] Change type of RTCRTPStreamStats.ssrc from string to unsigned long
[#149] Remove references saying "defines an API"
[#148] Explanation for "remoteSource"
[#156] frameWidth/frameHeight: use last decoded value
[#123] Explain "sum and count" design paradigm
[#126] Fix RTCStatsType for "stream"
[#127] Added "kind" to RTCMediaStreamTrackStats
[#129] Remove ssrcids field
[#128] Define audio level rigidly
[#122] Replace RTCTransportStats.active with .dtlsState
[#125] Added more datachannel counters, with definitions
[#142] Rename RTCRtpMediaStreamStats.trackId
[#167] Moving roundTripTime from outbound to inbound
[#139] Define terminology for "stats object" et al
[#169] Fix issues with TURN URL protocol
[#168] Align codec types with webrtc-pc
[#166] Removed cancelled and renamed inprogress to in-progress
[#164] Added remoteTimestamp to RTCRtpStreamTrackStats
[#138] Make a RTCMediaStreamTrackStats object per track attachment
[#165] Remove separation of received consent and connectivity requests
[#179] Adds definitions for RTCDataChannelStats members
[#176] Add link from datachannel to transport
[#181] Add section on obsoleted stats
[#182] Bandwidth estimations again
[#188] RTT undefined when no RTCP RR
11.3
Changes since 21 sep 2016
[#64] Added text about which specification is authoritative
[#59] Example conformance specification
[#71] Introduced the term "RTP stream"
[#76] Adding missing "codec" RTCStatsType
[#68] Design considerations section
[#73] Fix the summary of RTCTransportStats
[#75] Clarify that pliCount is only valid for video
[#70] Added QP statistics
[#90] Adding "deleted" property to RTCIceCandidate
[#93] Adding isRemote to RTCIceCandidate
[#95] Rename "RTT" to RoundTripTime
[#94] Add TransportId to RTCIceCandidateStats
[#43] Added procedures for new stats
This list does not include infrastructure and minor editorials.
11.4
Changes since 26 May 2016
[#54] Debug problems with ICE.
[#52] adding XRBLOCK metrics
[#51] Dashed enums and crosslinking to stats objects.
[#37] Clarified RTT units.
11.5
Changes since 23 October 2015
[#18] Updated spec changes.
[#17] Changed "remoteId" to "associateStatsId".
[#8] Ended and detached stats for a track.
[#33] Added the codec "implementation" variable.
[#34] Converted to WebIDL contiguous mode.
[#36] Aligned RTCIceCandidateStats with RTCIceCandidate.
[#24] Added packetsDiscarded and packetsRepaired to stats.
[#13] Aligned bitrate to the TIAS definition.
[#47] Changed RTCCodec to RTCCodecStats
Various formatting, layout and link fixes.
11.6
Changes since 03 February 2015
[#10] Added RTCRTPStreamStats.mediaType.
11.7
Changes since 30 September 2014
Kept getStats() in webrtc-pc. Changed RTCStatsType from enum to DOMString.
Added "datachannel" to RTCStatsType.
Added fractionLost to RTCInboundRTPStreamStats.
Clarified that bytesSent and bytesReceived do not include headers or padding.
11.8
Acknowledgements
The editors wish to thank the Working Group chairs, Stefan Håkansson, and the Team
Contact, Dominique Hazaël-Massieux, for their support. The editors would like to thank
Bernard Aboba, Taylor Brandstetter, Henrik Boström, Jan-Ivar Bruaroey, Karthik Budigere,
Cullen Jennings, and Lennart Schulte for their contributions to this specification.