<?xml version='1.0' encoding='utf-8'?>
<!DOCTYPE rfc [
  <!ENTITY nbsp    "&#160;">
  <!ENTITY zwsp   "&#8203;">
  <!ENTITY nbhy   "&#8209;">
  <!ENTITY wj     "&#8288;">
]>
<?xml-stylesheet type="text/xsl" href="rfc2629.xslt" ?>
<!-- generated by https://github.com/cabo/kramdown-rfc version 1.6.32 (Ruby 2.6.10) -->
<rfc xmlns:xi="http://www.w3.org/2001/XInclude" ipr="trust200902" docName="draft-kpugin-rush-02" category="info" tocInclude="true" sortRefs="true" symRefs="true" version="3">
  <!-- xml2rfc v2v3 conversion 3.17.1 -->
  <front>
    <title abbrev="rush">RUSH - Reliable (unreliable) streaming protocol</title>
    <seriesInfo name="Internet-Draft" value="draft-kpugin-rush-02"/>
    <author initials="K." surname="Pugin" fullname="Kirill Pugin">
      <organization>Facebook</organization>
      <address>
        <email>ikir@meta.com</email>
      </address>
    </author>
    <author initials="A." surname="Frindell" fullname="Alan Frindell">
      <organization>Facebook</organization>
      <address>
        <email>afrind@meta.com</email>
      </address>
    </author>
    <author initials="J." surname="Cenzano" fullname="Jordi Cenzano">
      <organization>Facebook</organization>
      <address>
        <email>jcenzano@meta.com</email>
      </address>
    </author>
    <author initials="J." surname="Weissman" fullname="Jake Weissman">
      <organization>Facebook</organization>
      <address>
        <email>jakeweissman@meta.com</email>
      </address>
    </author>
    <date year="2023" month="May" day="11"/>
    <area>General</area>
    <workgroup>TODO Working Group</workgroup>
    <keyword>Internet-Draft</keyword>
    <abstract>
      <?line 45?>

<t>RUSH is an application-level protocol for ingesting live video.
This document describes the protocol and how it maps onto QUIC.</t>
    </abstract>
    <note removeInRFC="true">
      <name>Discussion Venues</name>
      <t>Discussion of this document takes place on the
    mailing list (),
  which is archived at <eref target=""/>.</t>
      <t>Source for this draft and an issue tracker can be found at
  <eref target="https://github.com/afrind/draft-rush"/>.</t>
    </note>
  </front>
  <middle>
    <?line 50?>

<section anchor="introduction">
      <name>Introduction</name>
      <t>RUSH is a bidirectional application level protocol designed for live video
ingestion that runs on top of QUIC.</t>
      <t>RUSH was built as a replacement for RTMP (Real-Time Messaging Protocol), with
the goals of supporting new audio and video codecs, extensibility in the
form of new message types, and multi-track support. In addition, RUSH gives
applications the option to control data delivery guarantees by utilizing QUIC
streams.</t>
      <t>This document describes the RUSH protocol, wire format, and QUIC mapping.</t>
    </section>
    <section anchor="conventions-and-definitions">
      <name>Conventions and Definitions</name>
      <t>The key words "MUST", "MUST NOT", "REQUIRED", "SHALL", "SHALL NOT", "SHOULD",
"SHOULD NOT", "RECOMMENDED", "NOT RECOMMENDED", "MAY", and "OPTIONAL" in this
document are to be interpreted as described in BCP 14 <xref target="RFC2119"/> <xref target="RFC8174"/>
when, and only when, they appear in all capitals, as shown here.</t>
      <dl>
        <dt>Frame/Message:</dt>
        <dd>
          <t>logical unit of information that client and server can exchange</t>
        </dd>
        <dt>PTS:</dt>
        <dd>
          <t>presentation timestamp</t>
        </dd>
        <dt>DTS:</dt>
        <dd>
          <t>decoding timestamp</t>
        </dd>
        <dt>AAC:</dt>
        <dd>
          <t>Advanced Audio Coding</t>
        </dd>
        <dt>NALU:</dt>
        <dd>
          <t>network abstraction layer unit</t>
        </dd>
        <dt>VPS:</dt>
        <dd>
          <t>video parameter set (H265 video specific NALU)</t>
        </dd>
        <dt>SPS:</dt>
        <dd>
          <t>sequence parameter set (H264/H265 video specific NALU)</t>
        </dd>
        <dt>PPS:</dt>
        <dd>
          <t>picture parameter set (H264/H265 video specific NALU)</t>
        </dd>
        <dt>ADTS header:</dt>
        <dd>
          <t><em>Audio Data Transport Stream Header</em></t>
        </dd>
        <dt>ASC:</dt>
        <dd>
          <t>Audio Specific Config</t>
        </dd>
        <dt>GOP:</dt>
        <dd>
          <t>Group of pictures, specifies the order in which intra- and inter-frames are
arranged.</t>
        </dd>
      </dl>
    </section>
    <section anchor="theory-of-operations">
      <name>Theory of Operations</name>
      <section anchor="connection-establishment">
        <name>Connection establishment</name>
        <t>In order to live stream using RUSH, the client establishes a QUIC connection
using the ALPN token "rush".</t>
        <t>After the QUIC connection is established, the client creates a new bidirectional
QUIC stream, chooses a starting frame ID, and sends a <tt>Connect</tt> frame
<xref target="connect-frame"/> over that stream.  This stream is called the Connect Stream.</t>
        <t>The client sends the <tt>mode of operation</tt> setting in the <tt>Connect</tt> frame <xref target="connect-frame"/> payload.</t>
        <t>One connection SHOULD only be used to send one media stream. Currently, 1 video and 1 audio track are supported; in the future, multiple tracks per stream may be supported.</t>
      </section>
      <section anchor="sending-video-data">
        <name>Sending Video Data</name>
        <t>The client can choose to wait for the <tt>ConnectAck</tt> frame <xref target="connect-ack-frame"/>
or it can start optimistically sending data immediately after sending the <tt>Connect</tt> frame.</t>
        <t>A track is a logical organization of the data, for example, video can have one
video track, and two audio tracks (for two languages). The client can send data
for multiple tracks simultaneously.</t>
        <t>The encoded audio or video data of each track is serialized into frames (see
<xref target="audio-frame"/> or <xref target="video-frame"/>) and transmitted from the client to the
server.  Each track has its own monotonically increasing frame ID sequence. The
client MUST start with initial frame ID = 1.</t>
        <t>Depending on mode of operation (<xref target="quic-mapping"/>), the client sends audio and
video frames on the Connect stream or on a new QUIC stream for each frame.</t>
        <t>In <tt>Multi Stream Mode</tt> (<xref target="multi-stream-mode"/>), the client can stop sending a
frame by resetting the corresponding QUIC stream. In this case, there is no
guarantee that the frame was received by the server.</t>
      </section>
      <section anchor="receiving-data">
        <name>Receiving data</name>
        <t>Upon receiving the <tt>Connect</tt> frame <xref target="connect-frame"/>, if the server accepts the stream, it replies with a <tt>ConnectAck</tt> frame <xref target="connect-ack-frame"/> and prepares to receive audio/video data.</t>
        <t>It's possible that in <tt>Multi Stream Mode</tt> (<xref target="multi-stream-mode"/>), the server
receives audio or video data before it receives the <tt>Connect</tt> frame <xref target="connect-frame"/>.  The
implementation can choose whether to buffer or drop the data.
The audio/video data cannot be interpreted correctly before the arrival of the <tt>Connect</tt> frame <xref target="connect-frame"/>.</t>
        <t>In <tt>Single Stream Mode</tt> (<xref target="single-stream-mode"/>), the transport guarantees that
frames arrive at the application layer in the order they were sent.</t>
        <t>In <tt>Multi Stream Mode</tt>, it's possible that frames arrive at the application
layer in a different order than they were sent; therefore, the server MUST keep
track of the last received frame ID for every track that it receives. A gap in the
frame ID sequence on a given track can indicate out-of-order delivery, and the
server MAY wait until the missing frames arrive. The server must consider a frame lost
if the corresponding QUIC stream was reset.</t>
        <t>Upon detecting a gap in the frame sequence, the server MAY wait for the missing
frames to arrive for an implementation-defined time. If the missing frames don't
arrive, the server SHOULD consider them lost and continue processing the rest of the
frames. For example, if the server receives the following frames for track 1: <tt>1
2 3 5 6</tt> and frame <tt>#4</tt> hasn't arrived after the implementation-defined timeout,
the server SHOULD continue processing frames <tt>5</tt> and <tt>6</tt>.</t>
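        <t>The gap handling above can be sketched as a small reorder buffer. The
following is an illustrative sketch, not part of the protocol; the class name,
the monotonic-clock timeout check, and the callback are assumptions of this
sketch:</t>
        <sourcecode type="python"><![CDATA[
import time

class ReorderBuffer:
    """Releases frames for one track in ID order, skipping frames
    that have not arrived within `timeout` seconds of a detected gap."""

    def __init__(self, timeout, on_frame):
        self.timeout = timeout      # implementation-defined wait
        self.on_frame = on_frame    # downstream consumer callback
        self.next_id = 1            # client MUST start with frame ID 1
        self.pending = {}           # frame_id -> payload
        self.gap_since = None       # when the current gap was detected

    def receive(self, frame_id, payload):
        if frame_id >= self.next_id:
            self.pending[frame_id] = payload
        self._flush()

    def _flush(self):
        now = time.monotonic()
        while True:
            if self.next_id in self.pending:
                self.on_frame(self.next_id, self.pending.pop(self.next_id))
                self.next_id += 1
                self.gap_since = None
            elif self.pending:              # gap: later frames are waiting
                if self.gap_since is None:
                    self.gap_since = now    # MAY wait for missing frames
                    return
                if now - self.gap_since < self.timeout:
                    return
                self.next_id += 1           # SHOULD consider the frame lost
                self.gap_since = None
            else:
                return
]]></sourcecode>
        <t>With the sequence from the example above (1 2 3 5 6), frames 1 through 3 are
released immediately; once the timeout for missing frame 4 elapses, frames 5
and 6 are released.</t>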
        <t>It is worth highlighting that in multi stream mode a de-jitter function (which introduces latency) is needed, and the subsequent processing pipeline should tolerate lost frames, i.e., "holes" in the audio/video streams.</t>
        <t>When the client is done streaming, it sends the <tt>End of Video</tt> frame
(<xref target="end-of-video-frame"/>) to indicate to the server that there won't be any more
data sent.</t>
      </section>
      <section anchor="reconnect">
        <name>Reconnect</name>
        <t>If the QUIC connection is closed at any point, the client MAY reconnect by simply
repeating the <tt>Connection establishment</tt> process (<xref target="connection-establishment"/>) and
resume sending the same video where it left off.  In order to support
termination of the new connection by a different server, the client SHOULD
resume sending video frames starting with an I-frame, to guarantee that the video
track can be decoded from the first frame sent.</t>
        <t>A reconnect can be initiated by the server if it needs to "go away" for
maintenance. In this case, the server sends a <tt>GOAWAY</tt> frame (<xref target="goaway-frame"/>)
to advise the client to gracefully close the connection.  This allows the client to
finish sending some data and establish a new connection to continue sending
without interruption.</t>
      </section>
    </section>
    <section anchor="wire-format">
      <name>Wire Format</name>
      <section anchor="frame-header">
        <name>Frame Header</name>
        <t>The client and server exchange information using frames. There are different
types of frames and the payload of each frame depends on its type.</t>
        <t>All multi-byte fields on the wire are in <strong>big-endian</strong> byte order.</t>
        <t>Generic frame format:</t>
        <artwork><![CDATA[
0       1       2       3       4       5       6       7
+--------------------------------------------------------------+
|                       Length (64)                            |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
|Type(8)| Payload ...                                          |
+-------+------------------------------------------------------+
]]></artwork>
        <dl>
          <dt>Length(64):</dt>
          <dd>
            <t>Each frame starts with a Length field, a 64-bit value that gives the size of the frame
in bytes (including the predefined fields, so if LENGTH is 100 bytes, then PAYLOAD
length is 100 - 8 - 8 - 1 = 83 bytes).</t>
          </dd>
          <dt>ID(64):</dt>
          <dd>
            <t>64-bit frame sequence number; every new frame MUST have a sequence ID greater
than that of the previous frame within the same track.  The track ID is
specified in each frame; if it is not specified, it is implicitly 0.</t>
          </dd>
          <dt>Type(8):</dt>
          <dd>
            <t>1 byte representing the type of the frame.</t>
          </dd>
        </dl>
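        <t>For illustration, the generic header can be decoded with a few lines of code.
The following sketch (the function name is an assumption of this sketch, not
part of the protocol) assumes the whole frame is already buffered:</t>
        <sourcecode type="python"><![CDATA[
import struct

def parse_frame(buf):
    """Split one frame off the front of `buf`.

    All fields are big-endian; Length counts the whole frame,
    so the payload is Length - 8 - 8 - 1 bytes long."""
    length, frame_id, frame_type = struct.unpack_from(">QQB", buf, 0)
    payload = buf[17:length]
    return frame_id, frame_type, payload
]]></sourcecode>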
        <t>Predefined frame types:</t>
        <table>
          <thead>
            <tr>
              <th align="left">Frame Type</th>
              <th align="left">Frame</th>
            </tr>
          </thead>
          <tbody>
            <tr>
              <td align="left">0x0</td>
              <td align="left">Connect frame</td>
            </tr>
            <tr>
              <td align="left">0x1</td>
              <td align="left">Connect Ack frame</td>
            </tr>
            <tr>
              <td align="left">0x2</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x3</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x4</td>
              <td align="left">End of Video frame</td>
            </tr>
            <tr>
              <td align="left">0x5</td>
              <td align="left">Error frame</td>
            </tr>
            <tr>
              <td align="left">0x6</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x7</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x8</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x9</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0xA</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0xB</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0xC</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0xD</td>
              <td align="left">Video frame</td>
            </tr>
            <tr>
              <td align="left">0xE</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0xF</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x10</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x11</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x12</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x13</td>
              <td align="left">reserved</td>
            </tr>
            <tr>
              <td align="left">0x14</td>
              <td align="left">Audio frame</td>
            </tr>
            <tr>
              <td align="left">0x15</td>
              <td align="left">GOAWAY frame</td>
            </tr>
            <tr>
              <td align="left">0x16</td>
              <td align="left">TimedMetadata frame</td>
            </tr>
          </tbody>
        </table>
      </section>
      <section anchor="frames">
        <name>Frames</name>
        <section anchor="connect-frame">
          <name>Connect frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64)                            |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+-------+---------------+---------------+--------------+
| 0x0   |Version|Video Timescale|Audio Timescale|              |
+-------+-------+---------------+---------------+--------------+
|                    Live Session ID(64)                       |
+--------------------------------------------------------------+
| Payload ...                                                  |
+--------------------------------------------------------------+
]]></artwork>
          <dl>
            <dt>Version (unsigned 8bits):</dt>
            <dd>
              <t>version of the protocol (initial version is 0x0).</t>
            </dd>
            <dt>Video Timescale(unsigned 16bits):</dt>
            <dd>
              <t>timescale for all video frame timestamps on this connection; for instance, 25.</t>
            </dd>
            <dt>Audio Timescale(unsigned 16bits):</dt>
            <dd>
              <t>timescale for all audio sample timestamps on this connection; the recommended
value is the audio sample rate, for example 44100.</t>
            </dd>
            <dt>Live Session ID(unsigned 64bits):</dt>
            <dd>
              <t>identifier of the broadcast; when reconnecting, the client MUST use the same
Live Session ID.</t>
            </dd>
            <dt>Payload:</dt>
            <dd>
              <t>OPTIONAL application- and version-specific data that can be used by the server.
One possible implementation is to carry UTF-8 encoded JSON in the payload, specifying parameters the server needs to authenticate / validate the connection, for instance:</t>
              <artwork><![CDATA[
payloadBytes = strToJSonUtf8('{"url": "/rtmp/BID?s_bl=1&s_l=3&s_sc=VALID&s_sw=0&s_vt=usr_dev&a=TOKEN"}')
]]></artwork>
            </dd>
          </dl>
          <t>This frame is used by the client to initiate broadcasting. The client can start
sending other frames immediately after the Connect frame <xref target="connect-frame"/> without waiting
for an acknowledgement from the server.</t>
          <t>If the server doesn't support the VERSION sent by the client, the server sends an Error
frame <xref target="error-frame"/> with code <tt>UNSUPPORTED VERSION</tt>.</t>
          <t>If the audio timescale or video timescale is 0, the server sends an Error frame <xref target="error-frame"/> with
error code <tt>INVALID FRAME FORMAT</tt> and closes the connection.</t>
          <t>If the client receives a Connect frame from the server, the client sends an
Error frame <xref target="error-frame"/> with code <tt>TBD</tt>.</t>
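          <t>For illustration, a Connect frame could be serialized as follows. This is a
sketch, not a normative encoder; the function name and the example timescales
are assumptions of this sketch:</t>
          <sourcecode type="python"><![CDATA[
import struct

def build_connect(frame_id, video_timescale, audio_timescale,
                  session_id, payload=b""):
    """Serialize a Connect frame (type 0x0); all fields big-endian.
    Length covers the whole frame: the 8+8+1 generic header
    plus the 1+2+2+8 Connect-specific fields plus the payload."""
    length = 8 + 8 + 1 + 1 + 2 + 2 + 8 + len(payload)
    return struct.pack(">QQBBHHQ",
                       length,           # Length (64)
                       frame_id,         # ID (64)
                       0x0,              # Type: Connect
                       0x0,              # Version (initial version is 0x0)
                       video_timescale,  # e.g. 90000
                       audio_timescale,  # e.g. 44100
                       session_id) + payload
]]></sourcecode>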
        </section>
        <section anchor="connect-ack-frame">
          <name>Connect Ack frame</name>
          <artwork><![CDATA[
0       1       2       3       4       5       6       7
+--------------------------------------------------------------+
|                       Length (64) = 17                       |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
| 0x1   |
+-------+
]]></artwork>
          <t>The server sends the "Connect Ack" frame in response to the "Connect" frame <xref target="connect-frame"/>,
indicating that the server accepts the "version" and that the stream has been authenticated / validated (optional), so it is ready to receive data.</t>
          <t>If the client doesn't receive a "Connect Ack" frame from the server within a
timeout, it will close the connection.  The timeout value is chosen by the
implementation.</t>
          <t>Only one "Connect Ack" frame can be sent over the lifetime of the QUIC
connection.</t>
          <t>If the server receives a Connect Ack frame from the client, the server sends an
Error frame with code <tt>TBD</tt>.</t>
        </section>
        <section anchor="end-of-video-frame">
          <name>End of Video frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64) = 17                       |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
| 0x4   |
+-------+
]]></artwork>
          <t>The End of Video frame is sent by a client when it's done sending data and is about
to close the connection. The server SHOULD ignore all frames sent after it.</t>
        </section>
        <section anchor="error-frame">
          <name>Error frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64) = 29                       |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
| 0x5   |
+-------+------------------------------------------------------+
|                   Sequence ID (64)                           |
+------------------------------+-------------------------------+
|      Error Code (32)         |
+------------------------------+
]]></artwork>
          <dl>
            <dt>Sequence ID(unsigned 64bits):</dt>
            <dd>
              <t>ID of the client-sent frame for which the error was generated; ID=0x0
indicates a connection-level error.</t>
            </dd>
            <dt>Error Code(unsigned 32bits):</dt>
            <dd>
              <t>Indicates the error code</t>
            </dd>
          </dl>
          <t>An Error frame can be sent by either the client or the server to indicate that an
error occurred.</t>
          <t>Some errors are fatal and the connection will be closed after sending the Error
frame.</t>
          <t>See <xref target="connection-errors"/> and <xref target="frame-errors"/> for more information about error codes.</t>
        </section>
        <section anchor="video-frame">
          <name>Video frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64)                            |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+-------+----------------------------------------------+
|  0xD  | Codec |
+-------+-------+----------------------------------------------+
|                        PTS (64)                              |
+--------------------------------------------------------------+
|                        DTS (64)                              |
+-------+------------------------------------------------------+
|TrackID|                                                      |
+-------+-------+----------------------------------------------+
| I Offset      | Video Data ...                               |
+---------------+----------------------------------------------+
]]></artwork>
          <dl>
            <dt>Codec (unsigned 8bits):</dt>
            <dd>
              <t>specifies the codec used to encode this frame.</t>
            </dd>
            <dt>PTS (signed 64bits):</dt>
            <dd>
              <t>presentation timestamp, in the connection's video timescale</t>
            </dd>
            <dt>DTS (signed 64bits):</dt>
            <dd>
              <t>decoding timestamp, in the connection's video timescale</t>
            </dd>
          </dl>
          <t>Supported type of codecs:</t>
          <table>
            <thead>
              <tr>
                <th align="left">Type</th>
                <th align="left">Codec</th>
              </tr>
            </thead>
            <tbody>
              <tr>
                <td align="left">0x1</td>
                <td align="left">H264</td>
              </tr>
              <tr>
                <td align="left">0x2</td>
                <td align="left">H265</td>
              </tr>
              <tr>
                <td align="left">0x3</td>
                <td align="left">VP8</td>
              </tr>
              <tr>
                <td align="left">0x4</td>
                <td align="left">VP9</td>
              </tr>
            </tbody>
          </table>
          <dl>
            <dt>Track ID (unsigned 8bits):</dt>
            <dd>
              <t>ID of the track that this frame is on</t>
            </dd>
            <dt>I Offset (unsigned 16bits):</dt>
            <dd>
              <t>Distance from the sequence ID of the I-frame that is required to decode this
frame. This can be useful when deciding whether a frame can be dropped.</t>
            </dd>
            <dt>Video Data:</dt>
            <dd>
              <t>variable-length field that carries the actual, codec-dependent video frame
data</t>
            </dd>
          </dl>
          <t>For the h264/h265 codecs, "Video Data" is 1 or more NALUs in AVCC format (4-byte length header):</t>
          <artwork><![CDATA[
0       1       2       3
+------------------------------+
|      NALU Length (32)        |
+------------------------------+
| NALU Data ...
+------------------------------+
]]></artwork>
          <t>Every h264 video key-frame MUST start with SPS/PPS NALUs.
Every h265 video key-frame MUST start with VPS/SPS/PPS NALUs.</t>
          <t>Binary concatenation of the "video data" from consecutive video frames, without
data loss, MUST produce a valid h264/h265 bitstream.</t>
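          <t>Iterating over the NALUs in "Video Data" then amounts to reading 4-byte
big-endian length prefixes. An illustrative sketch (the function name is an
assumption of this sketch):</t>
          <sourcecode type="python"><![CDATA[
import struct

def iter_nalus(video_data):
    """Yield each NALU from AVCC-framed video data
    (4-byte big-endian length prefix before each NALU)."""
    offset = 0
    while offset < len(video_data):
        (nalu_len,) = struct.unpack_from(">I", video_data, offset)
        offset += 4
        yield video_data[offset:offset + nalu_len]
        offset += nalu_len
]]></sourcecode>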
        </section>
        <section anchor="audio-frame">
          <name>Audio frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64)                            |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+-------+----------------------------------------------+
| 0x14  | Codec |
+-------+-------+----------------------------------------------+
|                      Timestamp (64)                          |
+-------+-------+-------+--------------------------------------+
|TrackID|   Header Len  |
+-------+-------+-------+--------------------------------------+
| Header + Audio Data ...
+--------------------------------------------------------------+
]]></artwork>
          <dl>
            <dt>Codec (unsigned 8bits):</dt>
            <dd>
              <t>specifies the codec used to encode this frame.</t>
            </dd>
          </dl>
          <t>Supported type of codecs:</t>
          <table>
            <thead>
              <tr>
                <th align="left">Type</th>
                <th align="left">Codec</th>
              </tr>
            </thead>
            <tbody>
              <tr>
                <td align="left">0x1</td>
                <td align="left">AAC</td>
              </tr>
              <tr>
                <td align="left">0x2</td>
                <td align="left">OPUS</td>
              </tr>
            </tbody>
          </table>
          <dl>
            <dt>Timestamp (signed 64bits):</dt>
            <dd>
              <t>timestamp of the first audio sample in Audio Data.</t>
            </dd>
            <dt>Track ID (unsigned 8bits):</dt>
            <dd>
              <t>ID of the track that this frame is on</t>
            </dd>
            <dt>Header Len (unsigned 16bits):</dt>
            <dd>
              <t>Length in bytes of the audio header contained in the first portion of the payload</t>
            </dd>
            <dt>Audio Data (variable length field):</dt>
            <dd>
              <t>carries the audio header and 1 or more codec-dependent audio frames.</t>
            </dd>
          </dl>
          <t>For the AAC codec:</t>
          <ul spacing="normal">
            <li>"Audio Data" is 1 or more AAC samples, prefixed with the Audio Specific Config (ASC) header defined in <tt>ISO 14496-3</tt></li>
            <li>Binary concatenation of all AAC samples in "Audio Data" from consecutive audio frames, without data loss, MUST produce a valid AAC bitstream.</li>
          </ul>
          <t>For the OPUS codec:</t>
          <ul spacing="normal">
            <li>"Audio Data" is 1 or more OPUS samples, prefixed with the OPUS header as defined in <xref target="RFC7845"/></li>
          </ul>
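          <t>Decoding the fixed portion of an Audio frame is symmetric with the Video frame.
An illustrative sketch (the function name is an assumption of this sketch);
Header Len tells the receiver where the codec header ends and the samples
begin:</t>
          <sourcecode type="python"><![CDATA[
import struct

def parse_audio_frame(buf):
    """Parse one Audio frame (type 0x14); all fields big-endian.
    Returns (frame_id, codec, timestamp, track_id, header, samples)."""
    length, frame_id, ftype, codec, timestamp, track_id, header_len = \
        struct.unpack_from(">QQBBqBH", buf, 0)
    assert ftype == 0x14
    body = buf[29:length]   # 8+8+1+1+8+1+2 = 29 fixed bytes
    return (frame_id, codec, timestamp, track_id,
            body[:header_len], body[header_len:])
]]></sourcecode>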
        </section>
        <section anchor="goaway-frame">
          <name>GOAWAY frame</name>
          <artwork><![CDATA[
0       1       2       3       4       5       6       7
+--------------------------------------------------------------+
|                       Length (64) = 17                       |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
| 0x15  |
+-------+
]]></artwork>
          <t>The GOAWAY frame is used by the server to initiate graceful shutdown of a connection, for example, for server maintenance.</t>
          <t>Upon receiving a GOAWAY frame, the client MUST send the frames remaining in the current GOP and
stop sending new frames on this connection. The client SHOULD establish a new connection and resume sending frames there, so that on resuming, the video track starts with an IDR frame.</t>
          <t>After sending a GOAWAY frame, the server continues processing arriving frames
for an implementation-defined time, after which the server SHOULD close
the connection.</t>
        </section>
        <section anchor="timedmetadata-frame">
          <name>TimedMetadata frame</name>
          <artwork><![CDATA[
+--------------------------------------------------------------+
|                       Length (64)                            |
+--------------------------------------------------------------+
|                       ID (64)                                |
+-------+------------------------------------------------------+
| 0x16  |TrackID|
+-------+-------+----------------------------------------------+
|                      Topic (64)                              |
+--------------------------------------------------------------+
|                      EventMessage (64)                       |
+-------+------------------------------------------------------+
|                      Timestamp (64)                          |
+-------+------------------------------------------------------+
|                      Duration (64)                           |
+-------+------------------------------------------------------+
| Payload ...
+--------------------------------------------------------------+
]]></artwork>
          <dl>
            <dt>Track ID (unsigned 8bits):</dt>
            <dd>
              <t>ID of the track that this frame is on</t>
            </dd>
            <dt>Timestamp (signed 64bits):</dt>
            <dd>
              <t>PTS of the event</t>
            </dd>
            <dt>Topic (unsigned 64bits):</dt>
            <dd>
              <t>A unique identifier of the app-level feature. May be used to decode the payload or perform other application-specific processing</t>
            </dd>
            <dt>EventMessage (unsigned 64bits):</dt>
            <dd>
              <t>A unique identifier of the event message, used for app-level event deduplication</t>
            </dd>
            <dt>Duration (unsigned 64bits):</dt>
            <dd>
              <t>duration of the event in video PTS timescale. Can be 0.</t>
            </dd>
            <dt>Payload:</dt>
            <dd>
              <t>variable-length field. May be used by the app to send additional event metadata; UTF-8 JSON is recommended</t>
            </dd>
          </dl>
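          <t>A TimedMetadata frame could be assembled like this. This is an illustrative
sketch; the function name and the example JSON payload are assumptions of this
sketch:</t>
          <sourcecode type="python"><![CDATA[
import json
import struct

def build_timed_metadata(frame_id, track_id, topic, event_message,
                         timestamp, duration, payload=b""):
    """Serialize a TimedMetadata frame (type 0x16); all fields big-endian.
    Fixed portion: 8+8+1+1+8+8+8+8 = 50 bytes before the payload."""
    length = 8 + 8 + 1 + 1 + 8 + 8 + 8 + 8 + len(payload)
    return struct.pack(">QQBBQQqQ",
                       length, frame_id, 0x16, track_id,
                       topic, event_message, timestamp, duration) + payload

frame = build_timed_metadata(
    frame_id=7, track_id=0, topic=1, event_message=1001,
    timestamp=90000, duration=0,
    payload=json.dumps({"event": "poll_start"}).encode("utf-8"))
]]></sourcecode>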
        </section>
      </section>
      <section anchor="quic-mapping">
        <name>QUIC Mapping</name>
        <t>One of the main goals of the RUSH protocol is to give applications a way to
control the reliability of audio/video data delivery. This is achieved by
using a special mode (<xref target="multi-stream-mode"/>).</t>
        <section anchor="single-stream-mode">
          <name>Single Stream Mode</name>
          <t>In single stream mode, RUSH uses one bidirectional QUIC stream to send and receive
data.  Using one stream guarantees reliable, in-order delivery; applications
can rely on the QUIC transport layer to retransmit lost packets.  The performance
characteristics of this mode are similar to RTMP over TCP.</t>
        </section>
        <section anchor="multi-stream-mode">
          <name>Multi Stream Mode</name>
          <t>In single stream mode (<xref target="single-stream-mode"/>), if a packet belonging to a video frame is lost, packets sent
after it will not be delivered to the application until the loss is repaired, even
though those packets may have arrived at the server. This introduces head-of-line
blocking and can negatively impact latency.</t>
          <t>To address this problem, RUSH defines "Multi Stream Mode", in which one QUIC
stream is used per audio/video frame.</t>
          <t>Connection establishment follows the normal procedure: the client sends a Connect
frame, after which Video and Audio frames are sent according to the following rules:</t>
          <ul spacing="normal">
            <li>Each new frame is sent on a new bidirectional QUIC stream</li>
            <li>Frames within the same track must have monotonically increasing IDs,
such that ID(n) = ID(n-1) + 1</li>
          </ul>
          <t>The receiver reconstructs each track using the frame IDs.</t>
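<t>The reconstruction rule (consecutive IDs within a track) can be sketched as a small reordering buffer; this non-normative sketch accumulates frames arriving out of order on separate streams and releases them in ID order:</t>
<sourcecode type="python"><![CDATA[
```python
class TrackReassembler:
    """Reorder frames for one track by their ID, assuming
    ID(n) = ID(n-1) + 1 within the track."""

    def __init__(self, first_id: int = 0):
        self.next_id = first_id
        self.pending = {}  # frame_id -> frame bytes

    def on_frame(self, frame_id: int, data: bytes):
        """Accept a frame from any QUIC stream; return the frames
        that are now deliverable in order (possibly empty)."""
        self.pending[frame_id] = data
        out = []
        while self.next_id in self.pending:
            out.append(self.pending.pop(self.next_id))
            self.next_id += 1
        return out
```
]]></sourcecode>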
          <t>Response frames (Connect Ack, <xref target="connect-ack-frame"/>, and Error, <xref target="error-frame"/>) are sent on the return side of the
bidirectional stream that carried the frame they respond to.</t>
          <t>The client MAY control delivery reliability by setting a delivery timer for
every audio or video frame and resetting the QUIC stream when the timer fires.  This
effectively stops retransmissions if the frame was not fully delivered in
time.</t>
          <t>The timeout is implementation-defined; however, future versions of this draft will
define a way to negotiate it.</t>
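<t>The timer-based reliability control described above can be sketched as follows; the stream object and its methods here are hypothetical stand-ins for whatever write, acknowledgement, and reset primitives a QUIC implementation exposes:</t>
<sourcecode type="python"><![CDATA[
```python
import asyncio

class FakeStream:
    """Stand-in for a QUIC stream; real libraries expose
    equivalent write/ack/reset primitives."""

    def __init__(self, ack_after: float):
        self.ack_after = ack_after
        self.was_reset = False

    def write(self, data: bytes) -> None:
        pass  # hand the frame to the transport

    async def wait_acked(self) -> None:
        await asyncio.sleep(self.ack_after)  # simulate delivery time

    def reset(self) -> None:
        self.was_reset = True  # abandons retransmission of this frame

async def send_frame_with_deadline(stream, frame: bytes,
                                   timeout: float) -> None:
    """Send a frame; reset the stream if it is not fully
    acknowledged before the deadline."""
    stream.write(frame)
    try:
        await asyncio.wait_for(stream.wait_acked(), timeout)
    except asyncio.TimeoutError:
        stream.reset()  # frame is too old to be useful; stop resending
```
]]></sourcecode>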
        </section>
      </section>
    </section>
    <section anchor="error-handling">
      <name>Error Handling</name>
      <t>An endpoint that detects an error SHOULD signal the existence of that error to
its peer.  Errors can affect an entire connection (see <xref target="connection-errors"/>),
or a single frame (see <xref target="frame-errors"/>).</t>
      <t>The most appropriate error code SHOULD be included in the error frame that
signals the error.</t>
      <section anchor="connection-errors">
        <name>Connection Errors</name>
        <t>The following error codes affect the whole connection:</t>
        <t>1 - UNSUPPORTED VERSION - indicates that the server does not support the version
specified in the Connect frame.</t>
        <t>4 - CONNECTION_REJECTED - indicates that the server cannot process the connection
for any reason.</t>
      </section>
      <section anchor="frame-errors">
        <name>Frame errors</name>
        <t>There are two error codes defined in the core protocol that indicate a problem with
a particular frame:</t>
        <t>2 - UNSUPPORTED CODEC - indicates that the server does not support the given
audio or video codec.</t>
        <t>3 - INVALID FRAME FORMAT - indicates that the receiver was unable to parse
the frame or that one of its fields had an invalid value.</t>
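<t>Taken together with the connection errors above, the core error codes can be represented as a simple enumeration (non-normative sketch):</t>
<sourcecode type="python"><![CDATA[
```python
from enum import IntEnum

class ErrorCode(IntEnum):
    """Core RUSH error codes and their numeric values."""
    UNSUPPORTED_VERSION = 1   # connection error
    UNSUPPORTED_CODEC = 2     # frame error
    INVALID_FRAME_FORMAT = 3  # frame error
    CONNECTION_REJECTED = 4   # connection error

CONNECTION_ERRORS = {ErrorCode.UNSUPPORTED_VERSION,
                     ErrorCode.CONNECTION_REJECTED}

def is_connection_error(code: int) -> bool:
    """True if the error affects the whole connection."""
    return code in CONNECTION_ERRORS
```
]]></sourcecode>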
      </section>
    </section>
    <section anchor="extensions">
      <name>Extensions</name>
      <t>RUSH permits extension of the protocol.</t>
      <t>Extensions are permitted to use new frame types (<xref target="wire-format"/>), new error
codes (<xref target="error-frame"/>), or new audio and video codecs (<xref target="audio-frame"/>,
<xref target="video-frame"/>).</t>
      <t>Implementations MUST ignore unknown or unsupported values in all extensible
protocol elements, except <tt>codec id</tt>, for which an UNSUPPORTED CODEC error is
returned. Implementations MUST discard frames that have unknown or unsupported types.</t>
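<t>The rule of discarding unknown frame types can be sketched as a parser that walks length-prefixed frames; the header layout used here (an 8-byte big-endian length covering the whole frame, followed by a 1-byte type) is assumed for illustration only and is not the normative wire format:</t>
<sourcecode type="python"><![CDATA[
```python
import struct

HEADER_LEN = 9  # assumed: 8-byte length field + 1-byte type field

def iter_known_frames(buf: bytes, known_types: set):
    """Return (type, payload) for known frames; skip unknown frames
    wholesale using the length field."""
    out = []
    offset = 0
    while offset + HEADER_LEN <= len(buf):
        (length,) = struct.unpack_from(">Q", buf, offset)
        ftype = buf[offset + 8]
        if length < HEADER_LEN or offset + length > len(buf):
            raise ValueError("frame length field does not match data")
        if ftype in known_types:
            out.append((ftype, buf[offset + HEADER_LEN:offset + length]))
        offset += length  # advances past unknown frames too
    return out
```
]]></sourcecode>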
    </section>
    <section anchor="security-considerations">
      <name>Security Considerations</name>
      <t>The RUSH protocol relies on the security guarantees provided by the transport.</t>
      <t>Implementations SHOULD be prepared to handle cases where a sender deliberately sends
frames with gaps in sequence IDs.</t>
      <t>Implementations SHOULD be prepared to handle cases where the server never receives a
Connect frame (<xref target="connect-frame"/>).</t>
      <t>A frame parser MUST ensure that the value of the frame length field (see
<xref target="frame-header"/>) matches the actual length of the frame, including the frame
header.</t>
      <t>Implementations SHOULD be prepared to handle cases where a sender sends a frame with
a large frame length field value.</t>
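<t>A defensive check along these lines can reject inconsistent or implausibly large length fields before any buffer is allocated (non-normative sketch; the header length and maximum frame size used here are illustrative, not values defined by this document):</t>
<sourcecode type="python"><![CDATA[
```python
HEADER_LEN = 9        # illustrative frame header size
MAX_FRAME = 1 << 20   # illustrative implementation limit (1 MiB)

def validate_frame_length(length_field: int, available: int) -> None:
    """Raise ValueError if a frame's length field is inconsistent
    or would force an oversized allocation."""
    if length_field < HEADER_LEN:
        raise ValueError("length field smaller than the frame header")
    if length_field > MAX_FRAME:
        raise ValueError("length field exceeds implementation limit")
    if length_field > available:
        raise ValueError("length field exceeds received data")
```
]]></sourcecode>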
    </section>
    <section anchor="iana-considerations">
      <name>IANA Considerations</name>
      <t>TODO: add frame type registry, error code registry, audio/video codecs
registry</t>
    </section>
  </middle>
  <back>
    <references>
      <name>Normative References</name>
      <reference anchor="RFC2119">
        <front>
          <title>Key words for use in RFCs to Indicate Requirement Levels</title>
          <author fullname="S. Bradner" initials="S." surname="Bradner">
            <organization/>
          </author>
          <date month="March" year="1997"/>
          <abstract>
            <t>In many standards track documents several words are used to signify the requirements in the specification.  These words are often capitalized. This document defines these words as they should be interpreted in IETF documents.  This document specifies an Internet Best Current Practices for the Internet Community, and requests discussion and suggestions for improvements.</t>
          </abstract>
        </front>
        <seriesInfo name="BCP" value="14"/>
        <seriesInfo name="RFC" value="2119"/>
        <seriesInfo name="DOI" value="10.17487/RFC2119"/>
      </reference>
      <reference anchor="RFC8174">
        <front>
          <title>Ambiguity of Uppercase vs Lowercase in RFC 2119 Key Words</title>
          <author fullname="B. Leiba" initials="B." surname="Leiba">
            <organization/>
          </author>
          <date month="May" year="2017"/>
          <abstract>
            <t>RFC 2119 specifies common key words that may be used in protocol  specifications.  This document aims to reduce the ambiguity by clarifying that only UPPERCASE usage of the key words have the  defined special meanings.</t>
          </abstract>
        </front>
        <seriesInfo name="BCP" value="14"/>
        <seriesInfo name="RFC" value="8174"/>
        <seriesInfo name="DOI" value="10.17487/RFC8174"/>
      </reference>
      <reference anchor="RFC7845">
        <front>
          <title>Ogg Encapsulation for the Opus Audio Codec</title>
          <author fullname="T. Terriberry" initials="T." surname="Terriberry">
            <organization/>
          </author>
          <author fullname="R. Lee" initials="R." surname="Lee">
            <organization/>
          </author>
          <author fullname="R. Giles" initials="R." surname="Giles">
            <organization/>
          </author>
          <date month="April" year="2016"/>
          <abstract>
            <t>This document defines the Ogg encapsulation for the Opus interactive speech and audio codec.  This allows data encoded in the Opus format to be stored in an Ogg logical bitstream.</t>
          </abstract>
        </front>
        <seriesInfo name="RFC" value="7845"/>
        <seriesInfo name="DOI" value="10.17487/RFC7845"/>
      </reference>
    </references>
    <?line 676?>

<section numbered="false" anchor="acknowledgments">
      <name>Acknowledgments</name>
      <t>This draft is the work of many people: Vlad Shubin, Nitin Garg, Milen Lazarov,
Benny Luo, Nick Ruff, Konstantin Tsoy, Nick Wu.</t>
    </section>
  </back>

</rfc>
