Example 8-1 defines <GOBL-form> , the fictitious RIFF form of type GOBL .
Example 8-1 Type GOBL RIFF Form Definition

    <GOBL-form> ---> RIFF( 'GOBL'       // RIFF form header
                           [<org-ck>]   // Origin chunk (default (0,0,0))
                           <obj-list> ) // Series of graphical objects

    <org-ck>    ---> org( <origin:3D_POINT> )  // Origin of object list

    <obj-list>  ---> LIST( 'obj'
                           { <sqr-ck>  |      // An object is a square
                             <circ-ck> |      // or a circle
                             <poly-ck> }... ) // or a polygon

    <sqr-ck>    ---> sqr( <pt1:3D_POINT>      // a vertex
                          <pt2:3D_POINT>      // a second vertex
                          <pt3:3D_POINT> )    // a third vertex

    <circ-ck>   ---> circ( <center:3D_POINT>      // center of circle
                           <circumPt:3D_POINT> )  // point on circumference

    <poly-ck>   ---> poly( <pt:3D_POINT>... )  // list of points in polygon

    <3D_POINT>  ---> struct        // defined in "gobl.h"
                     {
                         INT x;    // x coordinate
                         INT y;    // y coordinate
                         INT z;    // z coordinate
                     } 3D_POINT
Example 8-2 shows a sample RIFF form that adheres to the form definition for form type GOBL . The example contains three subchunks: an INFO list, an org chunk, and an obj list.
Example 8-2 RIFF Form Adhering to Type GOBL RIFF Form Definition

    RIFF( 'GOBL'
          LIST('INFO'                  // INFO list: filename and copyright
               INAM("A House"Z)
               ICOP("(C) Copyright Joe Inc. 1991"Z)
               )
          org( 2, 0, 0 )               // Origin of object list
          LIST('obj'                   // Object list: two polygons
               poly( 0,0,0  2,0,0  2,2,0  1,3,0  0,2,0 )
               poly( 0,0,5  2,0,5  2,2,5  1,3,5  0,2,5 )
               )
          )                            // End of form
The RIFF format is a tagged-file structure and is the preferred format for multimedia files.
Table 8-6 shows the multimedia file types and formats read by Multimedia Services for OpenVMS.
File Type | File Format | Description |
---|---|---|
WAVE | .wav | Audio file |
AVI | .avi | Audio/video interleaved sequence |
See Section 6.3 for more information about the functions used to support file I/O to RIFF files.
8.4 WAVE File Format
The WAVE file format contains audio data encapsulated in chunks inside
a RIFF file. The <fmt-ck> (waveform format) chunk and the <data-ck>
(waveform data) chunk are mandatory in a WAVE file. The <fmt-ck> chunk
must always occur before the <data-ck> chunk. As with all RIFF files,
application programs must expect and ignore any unknown chunks.
Example 8-3 shows a WAVE format file.
Example 8-3 WAVE Format File

    <WAVEFORM> ---> RIFF( 'WAVE'       /* format type */
                          <fmt-ck>     /* waveform format */
                          <data-ck> )  /* waveform data */

    <fmt-ck>   ---> fmt( <wave-format>        /* WAVEFORMAT structure */
                         <format-specific> )  /* dependent on format category */

    <wave-format> ---> struct
                       {
                           WORD  wFormatTag;       /* format category */
                           WORD  nChannels;        /* number of channels */
                           DWORD nSamplesPerSec;   /* sampling rate */
                           DWORD nAvgBytesPerSec;  /* for buffering */
                           WORD  nBlockAlign;      /* block alignment */
                       }

    <PCM-format-specific> ---> struct
                               {
                                   UINT nBitsPerSample;  /* sample size */
                               }

    <data-ck>  ---> data( <wave-data> )
Use the multimedia file I/O functions described in Section 6.6 to get wave-format information from a WAVE file. Specifically, use the mmioDescend function to locate the fmt chunk containing the format information. Then, use the mmioRead function to read the format chunk directly into the proper format structure.
Example 8-4 shows how to access format information from a WAVE file using the mmioDescend and mmioRead functions.
Example 8-4 Obtaining Format Information from a WAVE File

    void ReversePlay()
    {
        HMMIO       hmmio;
        MMCKINFO    mmckinfoParent;
        MMCKINFO    mmckinfoSubchunk;
        DWORD       dwFmtSize;
        char        szFileName[ MAX_FILENAME_SIZE ];
        WAVEFORMAT  *pFormat;
        .
        .
        .
        /* Open the given file for reading. */
        .
        .
        .
        /* Locate a "RIFF" chunk with a "WAVE" form type */
        /* to make sure it's a WAVE file.                */
        .
        .
        .
        /* Now find the format chunk (form type "fmt "). It should be */
        /* a subchunk of the "RIFF" parent chunk.                     */

        mmckinfoSubchunk.ckid = mmioFOURCC('f', 'm', 't', ' ');
        if (mmioDescend(hmmio, &mmckinfoSubchunk, &mmckinfoParent,
                        MMIO_FINDCHUNK))
        {
            fprintf(stderr, "WAVE file is corrupted.");
            mmioClose(hmmio, 0);
            return;
        }

        /* Get the size of the format chunk, and allocate memory for it. */

        dwFmtSize = mmckinfoSubchunk.cksize;
        pFormat = (WAVEFORMAT *) mmeAllocMem(LOWORD(dwFmtSize));
        if (!pFormat)
        {
            fprintf(stderr, "Failed to allocate memory for format chunk.");
            mmioClose(hmmio, 0);
            return;
        }

        /* Read the format chunk. */

        if (mmioRead(hmmio, pFormat, dwFmtSize) != dwFmtSize)
        {
            fprintf(stderr, "Failed to read format chunk.");
            mmeFreeMem(pFormat);
            mmioClose(hmmio, 0);
            return;
        }

        /* Make sure it is a PCM file. */

        if (pFormat->wFormatTag != WAVE_FORMAT_PCM)
        {
            fprintf(stderr, "The file is not a PCM file.");
            mmeFreeMem(pFormat);
            mmioClose(hmmio, 0);
            return;
        }

        /* Make sure the system has a waveform output */
        /* device capable of playing this format.     */

        if (waveOutOpen(&hWaveOut, WAVE_MAPPER, (LPWAVEFORMAT)pFormat,
                        NULL, 0L, WAVE_FORMAT_QUERY))
        {
            fprintf(stderr, "The waveform device cannot play this format.");
            mmeFreeMem(pFormat);
            mmioClose(hmmio, 0);
            return;
        }
        .
        .
        .
    }
The fmt chunk contains the <wave-format> field, which includes a WAVEFORMAT data structure that specifies the format of the data contained in the data chunk.
The <wave-format> field (WAVEFORMAT data structure) contains the following fields:
wFormatTag
Specifies flags indicating the WAVE format category of the file. The
content of the <format-specific> field in the fmt chunk and the
interpretation of data in the data chunk depend on this value. The
following flags are defined:
WAVE_FORMAT_PCM
Indicates that the file is in PCM format.
WAVE_FORMAT_MULAW
Indicates that the file is in MULAW format.
nChannels
Specifies the number of channels represented in the data chunk, such
as 1 for mono or 2 for stereo.
nSamplesPerSec
Specifies the sampling rate (in samples per second) at which each
channel is to be played back.
nAvgBytesPerSec
Specifies the average number of bytes per second at which data in the
data chunk is to be transferred. If the value of the wFormatTag field
is WAVE_FORMAT_PCM, then the nAvgBytesPerSec field is equal to the
following:
nChannels * nSamplesPerSec * nBitsPerSample / 8
Playback software can estimate the buffer size using the value of the nAvgBytesPerSec field.
nBlockAlign
Specifies the block alignment (in bytes) of the data in the data
chunk. If the wFormatTag field is set to WAVE_FORMAT_PCM, then the
value of the nBlockAlign field is equal to the following:
nBitsPerSample * nChannels / 8
Applications that play back audio must process data in whole multiples of the nBlockAlign value at a time, so the nBlockAlign value can be used for buffer alignment.
The fmt chunk also contains the <format-specific> field. When the wFormatTag field of the <wave-format> structure is WAVE_FORMAT_PCM (for PCM format data), then the <format-specific> field contains the nBitsPerSample field, which specifies the number of bits of data used to represent each sample. See Section 8.4.3 for more information about PCM format data.
Applications that play back audio must be written to allow for and to
ignore any unknown <format-specific> arguments that occur at the end
of this field.
8.4.3 WAVE Format Categories
The format category of a WAVE file is specified by the value of the wFormatTag field in the fmt chunk. The representation of data in the data chunk and the content of the <format-specific> field in the fmt chunk depend on the format category.
Currently, two waveform audio format categories are supported by Multimedia Services for OpenVMS: pulse code modulation (PCM) and MULAW. For PCM data, use the WAVE_FORMAT_PCM flag as the value of the wFormatTag field. For MULAW data, use the WAVE_FORMAT_MULAW flag as the value of the wFormatTag field.
The data chunk contains samples represented in PCM format. For WAVE files in this category, the <format-specific> field of the fmt chunk contains a <PCM-format-specific> structure that has a single field, nBitsPerSample.
The nBitsPerSample field specifies the number of bits of data used to represent each sample of each channel. If there are multiple channels, the sample size is the same for each channel.
Each sample is contained in an integer ( i ). The size of i is the smallest number of bytes required to contain the specified sample size. The least significant byte is stored first. The bits that represent the sample amplitude are stored in the most significant bits of i and the remaining bits are set to zero.
For example, if the sample size (contained in the nBitsPerSample field) is 12 bits, then each sample is stored in a two-byte integer. The least significant four bits of the first (least significant) byte are set to zero.
See Table 3-1 for more information about the data format and maximum, minimum, and midpoint values for PCM waveform samples of various sizes.
In a single-channel WAVE file, samples are stored consecutively. For stereo WAVE files, channel 1 is the left channel and channel 2 is the right channel.
The speaker-position mapping for more than two channels is currently undefined. In multiple-channel WAVE files, samples are interleaved.
See Chapter 3 for information about data packing for mono and
stereo WAVE files.
8.4.4 WAVE File Examples
Example 8-5 presents a PCM WAVE file that has an 11.025 kHz sampling rate, mono, 8 bits/sample.
Example 8-5 PCM WAVE File: 11.025 kHz Sampling Rate, Mono, 8 Bits/Sample

    RIFF( 'WAVE'
          fmt(1, 1, 11025, 11025, 1, 8)
          data( <wave-data> )
          )
Example 8-6 presents a PCM WAVE file that has a 22.05 kHz sampling rate, stereo, 8 bits/sample.
Example 8-6 PCM WAVE File: 22.05 kHz Sampling Rate, Stereo, 8 Bits/Sample

    RIFF( 'WAVE'
          fmt(1, 2, 22050, 44100, 2, 8)
          data( <wave-data> )
          )
Example 8-7 presents a PCM WAVE file that has a 44.1 kHz sampling rate, mono, 20 bits/sample.
Example 8-7 PCM WAVE File: 44.1 kHz Sampling Rate, Mono, 20 Bits/Sample

    RIFF( 'WAVE'
          INFO(INAM("O Canada"Z))
          fmt(1, 1, 44100, 132300, 3, 20)
          data( <wave-data> )
          )
The Audio/Video Interleaved (AVI) file format is a RIFF file specification used with applications that capture, edit, and play back audio/video sequences. AVI files use the AVI RIFF form. The AVI RIFF form is identified by the four-character code "AVI ". A four-character code is a 32-bit quantity representing a sequence of one to four ASCII alphanumeric characters, padded on the right with blank characters.
All AVI files include two mandatory LIST chunks: 'hdrl' and 'movi' . The 'hdrl' list defines the format of the data. The 'movi' list contains the data for the AVI sequence. The 'hdrl' list must be positioned first in the file, followed by the 'movi' list. The index chunk, 'idx1' , is optional and specifies the location of data chunks within the file.
Example 8-8 shows an AVI RIFF file expanded with the chunks needed to complete the LIST 'hdrl' and 'movi' subchunks.
Example 8-8 Expanded AVI RIFF File

    RIFF ( 'AVI '
           LIST ('hdrl'
                 'avih'(<MAIN AVI Header>)
                 LIST ('strl'
                       'strh'(<Stream header>)
                       'strf'(<Stream format>)
                       'strd'(additional header data)
                       .
                       .
                       .
                       )
                 .
                 .
                 .
                 )
           LIST ('movi'
                 { SubChunk | LIST('rec '
                                   SubChunk1
                                   SubChunk2
                                   .
                                   .
                                   .
                                   )
                   .
                   .
                   .
                 }
                 .
                 .
                 .
                 )
           ['idx1'<AVIIndex>]
           )
The function prototypes, constants, flags, and data structures used in
the RIFF chunks are defined in the avifmt.h header file. The following
data structures are included in the header file: MainAVIHeader,
AVIStreamHeader, AVIPALCHANGE, and AVIINDEXENTRY.
8.6.1 MainAVIHeader Data Structure
An AVI file begins with the main header. This header is identified with the avih four-character code. The header contains general information about the file, such as the number of streams within the file and the width and height of the AVI sequence.
Example 8-9 shows the MainAVIHeader data structure definition.
Example 8-9 MainAVIHeader Data Structure Definition

    typedef struct {
        DWORD dwMicroSecPerFrame;     /* frame display rate */
        DWORD dwMaxBytesPerSec;       /* maximum transfer rate in bytes/sec */
        DWORD dwPaddingGranularity;   /* pad to multiples of this size */
        DWORD dwFlags;                /* flags specifying info about AVI file */
        DWORD dwTotalFrames;          /* number of frames in AVI file */
        DWORD dwInitialFrames;        /* amount audio stream is skewed */
        DWORD dwStreams;              /* number of streams in the AVI file */
        DWORD dwSuggestedBufferSize;  /* buffer size for reading AVI file */
        DWORD dwWidth;                /* width of the AVI file in pixels */
        DWORD dwHeight;               /* height of the AVI file in pixels */
        DWORD dwReserved[4];          /* reserved */
    } MainAVIHeader;
The MainAVIHeader data structure has the following fields:
dwMicroSecPerFrame
Specifies the frame display rate. This value indicates the overall
timing for the file.
dwMaxBytesPerSec
Specifies the approximate maximum data transfer rate of the file. This
value indicates the number of bytes per second the system must handle
to present an AVI sequence as specified by the other parameters
contained in the main header and stream header chunks.
dwPaddingGranularity
Specifies the padding size. All padding is done to multiples of this
size. This is normally 2 kB.
For CD-ROM usage, audio data has to be separated into single-frame pieces, and audio and video for each frame needs to be grouped into record ( 'rec' ) chunks. The record chunks must be padded so that their sizes are multiples of 2 kB, and the beginning of the actual data in the LIST chunk lies on a 2-kB boundary in the file.
dwFlags
Specifies any applicable flags. The following flags are defined:
AVIF_HASINDEX
Indicates that the AVI file has an 'idx1' chunk.
AVIF_MUSTUSEINDEX
Indicates that the index must be used to determine the order of presentation of the data.
AVIF_ISINTERLEAVED
Indicates that the AVI file is interleaved.
AVIF_WASCAPTUREFILE
Indicates that the AVI file is a specially allocated file used for capturing real-time video.
AVIF_COPYRIGHTED
Indicates that the AVI file contains copyrighted data.
dwTotalFrames
Specifies the total number of frames of data in the file.
dwInitialFrames
Specifies the number of frames in the file prior to the initial frame
of the AVI sequence. Use this field when creating interleaved files to
indicate how far ahead of the video data the audio data is skewed.
dwStreams
Specifies the number of streams in the file. For example, a file with
audio and video has two streams.
dwSuggestedBufferSize
Specifies the suggested buffer size for reading the file. Generally,
this size should be large enough to contain the largest chunk in the
file. If set to zero, or if it is too small, the playback software must
reallocate memory during playback, which reduces performance. For an
interleaved file, the buffer size must be large enough to read an
entire record and not just a chunk.
dwWidth
Specifies the width of the frame in the AVI file in pixels.
dwHeight
Specifies the height of the frame in the AVI file in pixels.
dwReserved[4]
Reserved.
8.6.2 AVIStreamHeader Data Structure
The AVIStreamHeader data structure contains header information for a single stream of a file. The main header is followed by one or more 'strl' chunks. A 'strl' chunk is required for each data stream. These chunks contain information about the streams in the file. Each 'strl' chunk must contain a stream header and stream format chunk. Stream header chunks are identified by the four-character code 'strh' and stream format chunks are identified by the four-character code 'strf' .
In addition to the stream header and stream format chunks, the 'strl' chunk might also contain a stream data chunk. Stream data chunks are identified by the four-character code 'strd' .
Example 8-10 shows the AVIStreamHeader data structure definition.
Example 8-10 AVIStreamHeader Data Structure Definition

    typedef struct {
        FOURCC fccType;               /* type of data in the stream */
        FOURCC fccHandler;            /* the specific data handler */
        DWORD  dwFlags;               /* information about the data stream */
        DWORD  dwPriority;            /* reserved field */
        DWORD  dwInitialFrames;       /* audio data skew */
        DWORD  dwScale;               /* scaling value used with dwRate */
        DWORD  dwRate;                /* video frame rate or audio sample rate */
        DWORD  dwStart;               /* starting time of AVI file */
        DWORD  dwLength;              /* length of AVI file */
        DWORD  dwSuggestedBufferSize; /* maximum buffer size */
        DWORD  dwQuality;             /* quality of video stream */
        DWORD  dwSampleSize;          /* sample size */
        DUMMYRECT rcFrame;            /* text or video stream destination */
    } AVIStreamHeader;
The AVIStreamHeader data structure has the following fields:
fccType
Contains a four-character code that specifies the type of data
contained in the stream. The following values are currently defined for
AVI data:
'vids'
Indicates that the stream contains video data. The stream format chunk contains a BITMAPINFOHEADER data structure that can include palette information.
'auds'
Indicates that the stream contains audio data. The stream format chunk contains a WAVEFORMAT or a PCMWAVEFORMAT data structure.
Other four-character codes identify non-AVI data.
fccHandler
Contains a four-character code that identifies a specific data handler.
The data handler is the preferred handler for the stream.
dwFlags
Specifies any applicable flags. The bits in the high-order word of
these flags are specific to the type of data contained in the stream.
The following flags are currently defined:
AVISF_DISABLED
Indicates that this stream must not be enabled by default.
AVISF_VIDEO_PALCHANGES
Indicates that this video stream contains palette changes. This flag warns the playback software that it will need to animate the palette.
dwPriority
Reserved field. Set to 0.
dwInitialFrames
Specifies how far audio data is skewed ahead of the video frames in
interleaved files. This is typically about 0.75 seconds. Set this value
equal to the value of the dwInitialFrames field in the
MainAVIHeader data structure.
dwScale
Is used with the value of the dwRate field to specify
the time scale that this stream uses. Divide the
dwRate value by the dwScale value to
determine the number of video frames or audio samples per second.
dwRate
Specifies the number of video frames or audio samples per second. For
video streams, this rate is the frame rate. For audio streams
containing PCM data, this rate is the sample rate.
The dwRate value is divided by the dwScale value to determine the number of video frames or audio samples per second. This is useful for specifying nonintegral frame rates or sample rates. For example, to produce a video frame rate of 29.97 frames per second, an application can set the dwRate field to 2997 and the dwScale field to 100.
dwStart
Specifies the starting time of the AVI file. Normally, this time is
zero; use the value of the dwStart field to specify a delay time for a
stream that does not start concurrently with the file.
For CD-ROM usage, the audio stream must be skewed approximately 0.75 seconds ahead of the video stream to allow prebuffering of data to reduce or eliminate the chance for gaps in the audio playback. Compaq recommends that all AVI files be encoded this way.
dwLength
Specifies the length of this stream. The units are defined by the
dwRate and dwScale fields of the
stream's header.