Task 153: .DRC File Format
.DRC File Format Specifications
The .DRC file extension is used by Google Draco, an open-source library and binary bitstream format for compressing 3D meshes and point clouds for efficient storage and transmission. The format is documented in the Draco Bitstream Specification (https://google.github.io/draco/spec/). A .DRC file consists of a header, optional metadata, connectivity data (whose layout depends on the encoding method), and attribute data.
1. List of All Properties Intrinsic to the File Format
Based on the bitstream specification, the following are the key properties and fields that can be extracted from a .DRC file, covering header fields, metadata structures, connectivity details, and attribute descriptors. Note that some properties are conditional, depending on the flags, the encoder type, and the encoding method. The format uses little-endian byte order for multi-byte integers and IEEE 754 for floating-point values; variable-length unsigned integers (varUI32 below) use an LEB128-style encoding.
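As a quick illustration of that varint encoding, here is a minimal JavaScript sketch (the function name readVarUint32 and the sample bytes are purely illustrative): each byte contributes its low 7 bits, and a set high bit means another byte follows.

// Minimal LEB128-style varint decoder, mirroring Draco's varUI32 reads (illustrative).
function readVarUint32(bytes, offset) {
  let value = 0, shift = 0;
  while (true) {
    const b = bytes[offset++];      // next byte
    value |= (b & 0x7f) << shift;   // low 7 bits carry the payload
    if ((b & 0x80) === 0) break;    // high bit clear = last byte
    shift += 7;
  }
  return { value: value >>> 0, offset };
}

// Example: the bytes 0xE5 0x8E 0x26 decode to 624485.
console.log(readVarUint32(new Uint8Array([0xe5, 0x8e, 0x26]), 0).value); // 624485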
Header Properties:
- Magic string: 5-byte ASCII "DRACO"
- Major version: 8-bit unsigned integer (UI8)
- Minor version: UI8
- Encoder type: UI8 (0 = POINT_CLOUD, 1 = TRIANGULAR_MESH)
- Encoder method: UI8; for meshes, 0 = MESH_SEQUENTIAL_ENCODING and 1 = MESH_EDGEBREAKER_ENCODING (for point clouds, the same values select sequential and KD-tree encoding)
- Flags: 16-bit unsigned integer (UI16); bit 15 (METADATA_FLAG_MASK = 32768) indicates presence of metadata
Metadata Properties (if METADATA_FLAG_MASK is set in flags):
- Number of attribute metadata: Variable unsigned 32-bit integer (varUI32)
- For each attribute metadata element:
  - Attribute metadata ID: varUI32
  - Number of entries: varUI32
  - For each entry:
    - Key size: UI8
    - Key: Array of I8 (signed 8-bit) of key size length
    - Value size: UI8
    - Value: Array of I8 of value size length
  - Number of sub-metadata: varUI32
  - Sub-metadata: Recursive metadata elements (same structure)
- File-level metadata: Same structure as an attribute metadata element (parsed after all attribute metadata)
Connectivity Properties (triangular meshes only; layout depends on the encoder method):
- If MESH_SEQUENTIAL_ENCODING (0):
  - Number of faces: varUI32
  - Number of points: varUI32
  - Connectivity method: UI8 (0 = compressed indices, 1 = uncompressed indices)
  - Indices: variable width based on num_points (UI8/UI16/varUI32/UI32 per index, or compressed differentials)
- If MESH_EDGEBREAKER_ENCODING (1):
  - Edgebreaker traversal type: UI8 (0 = STANDARD_EDGEBREAKER, 2 = VALENCE_EDGEBREAKER)
  - Number of encoded vertices: varUI32
  - Number of faces: varUI32
  - Number of attribute data: UI8
  - Number of encoded symbols: varUI32
  - Number of encoded split symbols: varUI32
  - Topology splits: number of topology splits (varUI32), source ID deltas, split ID deltas, source edge bits
  - Symbol data: encoded symbols (variable-length bit stream)
  - Attribute connectivity data: for each attribute (num_attribute_data times), number of connectivity corners (varUI32) and connectivity deltas
  - If VALENCE_EDGEBREAKER: valence header and context data for low/high valences
Attribute Properties:
- Number of attribute decoders: UI8
- For each decoder:
  - Attribute decoder data ID: UI8
  - Attribute decoder type: UI8 (0 = MESH_VERTEX_ATTRIBUTE, 1 = MESH_CORNER_ATTRIBUTE)
  - Attribute traversal method: UI8 (0 = DEPTH_FIRST, 1 = PREDICTION_DEGREE)
  - Attribute type: UI8 (0 = POSITION, 1 = NORMAL, 2 = COLOR, 3 = TEX_COORD, 4 = GENERIC)
  - Data type: UI8 (e.g., 6 = UINT32, 9 = FLOAT32)
  - Number of components: UI8 (e.g., 3 for XYZ positions)
  - Normalized: UI8 (0 or 1, indicating whether integer values are normalized)
  - Unique ID: varUI32
  - Prediction scheme: 8-bit (e.g., NONE = -2, DIFFERENCE = 0, PARALLELOGRAM = 1)
  - Prediction transform type: 8-bit (e.g., NONE = -1, WRAP = 1, NORMAL_OCTAHEDRON = 2)
  - Additional prediction/transform parameters (e.g., quantization bits, min/max values for quantized data)
These properties define the structure and content of the 3D data, excluding the raw encoded attribute and connectivity bits (which are data, not properties).
2. Two Direct Download Links for .DRC Files
- https://raw.githubusercontent.com/google/draco/master/testdata/test_nm_edgebreaker.111.drc
- https://raw.githubusercontent.com/google/draco/master/testdata/test_nm_point_cloud.drc
These are sample test files from the official Google Draco GitHub repository.
3. Ghost Blog Embedded HTML/JavaScript for Drag-and-Drop .DRC File Dump
Below is an HTML snippet with embedded JavaScript that can be pasted into a Ghost post via an HTML card. It accepts a dragged-and-dropped .DRC file, parses it with a DataView, and dumps the extracted properties to the page. Note: fully decoding the connectivity and attribute data requires a complete bitstream decoder (varints, rANS entropy coding, and so on), which is complex; the snippet below parses the header and the optional metadata and leaves the raw encoded payloads untouched.
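The following is a minimal sketch under those assumptions; the element IDs (drop-zone, output) and function names (parseDrc, parseMetadataElement, readVarUint32) are arbitrary choices for illustration, not part of any Draco API.

<div id="drop-zone" style="border:2px dashed #888;padding:2em;text-align:center;">
  Drop a .drc file here
</div>
<pre id="output"></pre>
<script>
(function () {
  // LEB128-style varUI32 reader; returns [value, newOffset].
  function readVarUint32(view, offset) {
    let value = 0, shift = 0, b;
    do {
      b = view.getUint8(offset++);
      value |= (b & 0x7f) << shift;
      shift += 7;
    } while (b & 0x80);
    return [value >>> 0, offset];
  }

  // One metadata element: key/value entries followed by nested sub-metadata.
  function parseMetadataElement(view, offset) {
    const dec = new TextDecoder();
    const meta = { entries: [], sub_metadata: [] };
    let numEntries;
    [numEntries, offset] = readVarUint32(view, offset);
    for (let i = 0; i < numEntries; i++) {
      const keySize = view.getUint8(offset++);
      const key = dec.decode(new Uint8Array(view.buffer, offset, keySize)); offset += keySize;
      const valueSize = view.getUint8(offset++);
      const value = dec.decode(new Uint8Array(view.buffer, offset, valueSize)); offset += valueSize;
      meta.entries.push([key, value]);
    }
    let numSub;
    [numSub, offset] = readVarUint32(view, offset);
    for (let i = 0; i < numSub; i++) {
      let sub;
      [sub, offset] = parseMetadataElement(view, offset);
      meta.sub_metadata.push(sub);
    }
    return [meta, offset];
  }

  function parseDrc(buffer) {
    const view = new DataView(buffer);
    const props = {};
    let offset = 0;
    // Fixed header.
    props.magic = String.fromCharCode(...new Uint8Array(buffer, 0, 5)); offset += 5;
    if (props.magic !== 'DRACO') throw new Error('Not a Draco file');
    props.major_version = view.getUint8(offset++);
    props.minor_version = view.getUint8(offset++);
    const encType = view.getUint8(offset++);
    props.encoder_type = encType === 0 ? 'POINT_CLOUD' : 'TRIANGULAR_MESH';
    // 0/1; for meshes this means SEQUENTIAL/EDGEBREAKER, for point clouds SEQUENTIAL/KD-TREE.
    props.encoder_method = view.getUint8(offset++);
    props.flags = view.getUint16(offset, true); offset += 2;  // little-endian
    props.has_metadata = (props.flags & 0x8000) !== 0;
    // Optional metadata: per-attribute ID + element, then the file-level metadata.
    if (props.has_metadata) {
      let numAttMeta;
      [numAttMeta, offset] = readVarUint32(view, offset);
      props.att_metadata = [];
      for (let i = 0; i < numAttMeta; i++) {
        let attId, meta;
        [attId, offset] = readVarUint32(view, offset);
        [meta, offset] = parseMetadataElement(view, offset);
        props.att_metadata.push({ att_id: attId, ...meta });
      }
      [props.file_metadata, offset] = parseMetadataElement(view, offset);
    }
    // Encoded connectivity/attribute payloads are not decoded here.
    return props;
  }

  const zone = document.getElementById('drop-zone');
  const out = document.getElementById('output');
  zone.addEventListener('dragover', (e) => e.preventDefault());
  zone.addEventListener('drop', async (e) => {
    e.preventDefault();
    const file = e.dataTransfer.files[0];
    if (!file) return;
    try {
      out.textContent = JSON.stringify(parseDrc(await file.arrayBuffer()), null, 2);
    } catch (err) {
      out.textContent = 'Parse error: ' + err.message;
    }
  });
})();
</script>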
4. Python Class for .DRC File Handling
Here's a Python class that can open, decode (parse), read properties, write a minimal .DRC file (header only; full encoding requires the Draco library or a custom implementation), and print properties to the console. It uses struct for binary parsing and a custom LEB128-style varint reader.
import struct
import sys


class DRCFile:
    def __init__(self, filepath=None):
        self.filepath = filepath
        self.properties = {}
        if filepath:
            self.decode()

    def read_var_ui32(self, data, offset):
        # LEB128-style unsigned varint (varUI32): 7 payload bits per byte,
        # high bit set on all but the last byte.
        value = 0
        shift = 0
        while True:
            byte = data[offset]
            offset += 1
            value |= (byte & 0x7F) << shift
            if (byte & 0x80) == 0:
                break
            shift += 7
        return value, offset

    def decode(self):
        with open(self.filepath, 'rb') as f:
            data = f.read()
        offset = 0

        # --- Header ---
        magic = data[offset:offset + 5].decode('ascii')
        offset += 5
        if magic != 'DRACO':
            raise ValueError('Invalid DRC file')
        self.properties['magic'] = magic
        major, = struct.unpack('<B', data[offset:offset + 1])
        offset += 1
        self.properties['major_version'] = major
        minor, = struct.unpack('<B', data[offset:offset + 1])
        offset += 1
        self.properties['minor_version'] = minor
        encoder_type, = struct.unpack('<B', data[offset:offset + 1])
        offset += 1
        self.properties['encoder_type'] = 'POINT_CLOUD' if encoder_type == 0 else 'TRIANGULAR_MESH'
        encoder_method, = struct.unpack('<B', data[offset:offset + 1])
        offset += 1
        # The names below assume a triangular mesh; point clouds use
        # sequential/KD-tree encodings for the same 0/1 values.
        self.properties['encoder_method'] = 'MESH_SEQUENTIAL_ENCODING' if encoder_method == 0 else 'MESH_EDGEBREAKER_ENCODING'
        flags, = struct.unpack('<H', data[offset:offset + 2])
        offset += 2
        self.properties['flags'] = flags
        has_metadata = bool(flags & 0x8000)
        self.properties['has_metadata'] = has_metadata

        # --- Metadata ---
        if has_metadata:
            def parse_metadata(off):
                # One metadata element: entries, then nested sub-metadata (simplified).
                props = {}
                num_entries, off = self.read_var_ui32(data, off)
                props['num_entries'] = num_entries
                entries = []
                for _ in range(num_entries):
                    key_size = data[off]
                    off += 1
                    key = data[off:off + key_size].decode('utf-8', errors='ignore')
                    off += key_size
                    value_size = data[off]
                    off += 1
                    value = data[off:off + value_size].decode('utf-8', errors='ignore')
                    off += value_size
                    entries.append((key, value))
                props['entries'] = entries
                num_sub, off = self.read_var_ui32(data, off)
                props['num_sub_metadata'] = num_sub
                sub_meta = []
                for _ in range(num_sub):
                    sub, off = parse_metadata(off)
                    sub_meta.append(sub)
                props['sub_metadata'] = sub_meta
                return props, off

            num_att_meta, offset = self.read_var_ui32(data, offset)
            self.properties['num_att_metadata'] = num_att_meta
            att_metadata = []
            for _ in range(num_att_meta):
                # Each attribute metadata element is preceded by its attribute ID.
                att_id, offset = self.read_var_ui32(data, offset)
                meta, offset = parse_metadata(offset)
                meta['att_metadata_id'] = att_id
                att_metadata.append(meta)
            self.properties['att_metadata'] = att_metadata
            # The file-level metadata follows the attribute metadata.
            file_meta, offset = parse_metadata(offset)
            self.properties['file_metadata'] = file_meta

        # --- Connectivity (triangular meshes only; counts, not the encoded payload) ---
        if encoder_type == 1:
            if encoder_method == 0:  # MESH_SEQUENTIAL_ENCODING
                num_faces, offset = self.read_var_ui32(data, offset)
                num_points, offset = self.read_var_ui32(data, offset)
                conn_method, = struct.unpack('<B', data[offset:offset + 1])
                offset += 1
                self.properties['num_faces'] = num_faces
                self.properties['num_points'] = num_points
                self.properties['connectivity_method'] = 'Compressed' if conn_method == 0 else 'Uncompressed'
            else:  # MESH_EDGEBREAKER_ENCODING
                traversal_type, = struct.unpack('<B', data[offset:offset + 1])
                offset += 1
                num_encoded_verts, offset = self.read_var_ui32(data, offset)
                num_faces, offset = self.read_var_ui32(data, offset)
                num_att_data, = struct.unpack('<B', data[offset:offset + 1])
                offset += 1
                num_symbols, offset = self.read_var_ui32(data, offset)
                num_splits, offset = self.read_var_ui32(data, offset)
                self.properties['traversal_type'] = 'STANDARD' if traversal_type == 0 else 'VALENCE'
                self.properties['num_encoded_vertices'] = num_encoded_verts
                self.properties['num_faces'] = num_faces
                self.properties['num_attribute_data'] = num_att_data
                self.properties['num_encoded_symbols'] = num_symbols
                self.properties['num_encoded_split_symbols'] = num_splits

        # --- Attributes (partial) ---
        # The encoded connectivity payload sits between the counts above and the
        # attribute decoder headers; this simplified parser does not skip it, so
        # the values below are illustrative and may be misread for real files.
        try:
            num_att_dec, = struct.unpack('<B', data[offset:offset + 1])
            offset += 1
            self.properties['num_attribute_decoders'] = num_att_dec
            att_decs = []
            for _ in range(num_att_dec):
                data_id, = struct.unpack('<B', data[offset:offset + 1])
                offset += 1
                dec_type, = struct.unpack('<B', data[offset:offset + 1])
                offset += 1
                trav_method, = struct.unpack('<B', data[offset:offset + 1])
                offset += 1
                att_type, = struct.unpack('<B', data[offset:offset + 1])
                offset += 1
                data_type, = struct.unpack('<B', data[offset:offset + 1])
                offset += 1
                num_comp, = struct.unpack('<B', data[offset:offset + 1])
                offset += 1
                normalized, = struct.unpack('<B', data[offset:offset + 1])
                offset += 1
                unique_id, offset = self.read_var_ui32(data, offset)
                att_decs.append({
                    'data_id': data_id,
                    'decoder_type': dec_type,
                    'traversal_method': trav_method,
                    'att_type': att_type,
                    'data_type': data_type,
                    'num_components': num_comp,
                    'normalized': normalized,
                    'unique_id': unique_id,
                })
            self.properties['attribute_decoders'] = att_decs
        except (struct.error, IndexError):
            pass

    def print_properties(self):
        for key, value in self.properties.items():
            print(f"{key}: {value}")

    def write(self, output_path):
        # Minimal write: header only (a full writer would also need to encode
        # metadata, connectivity, and attribute data).
        data = b'DRACO'
        data += struct.pack('<B', self.properties.get('major_version', 1))
        data += struct.pack('<B', self.properties.get('minor_version', 0))
        encoder_type = 0 if self.properties.get('encoder_type', 'POINT_CLOUD') == 'POINT_CLOUD' else 1
        data += struct.pack('<B', encoder_type)
        encoder_method = 0 if self.properties.get('encoder_method', 'MESH_SEQUENTIAL_ENCODING') == 'MESH_SEQUENTIAL_ENCODING' else 1
        data += struct.pack('<B', encoder_method)
        data += struct.pack('<H', self.properties.get('flags', 0))
        with open(output_path, 'wb') as f:
            f.write(data)


# Example usage
if __name__ == '__main__':
    if len(sys.argv) > 1:
        drc = DRCFile(sys.argv[1])
        drc.print_properties()
5. Java Class for .DRC File Handling
Here's a Java class that can open, decode, read, write (basic header only), and print properties. It uses DataInputStream for parsing.
import java.io.*;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class DRCFile {
    private String filepath;
    private String magic;
    private int majorVersion;
    private int minorVersion;
    private String encoderType;
    private String encoderMethod;
    private int flags;
    private boolean hasMetadata;
    // Add other properties as fields...

    public DRCFile(String filepath) {
        this.filepath = filepath;
        if (filepath != null) {
            decode();
        }
    }

    private int readVarUI32(DataInputStream dis) throws IOException {
        // LEB128-style varUI32 reader.
        int value = 0;
        int shift = 0;
        while (true) {
            int byteVal = dis.readUnsignedByte();
            value |= (byteVal & 0x7F) << shift;
            if ((byteVal & 0x80) == 0) break;
            shift += 7;
        }
        return value;
    }

    public void decode() {
        try (FileInputStream fis = new FileInputStream(filepath);
             DataInputStream dis = new DataInputStream(fis)) {
            byte[] magicBytes = new byte[5];
            dis.readFully(magicBytes);
            magic = new String(magicBytes);
            if (!magic.equals("DRACO")) {
                throw new IOException("Invalid DRC file");
            }
            majorVersion = dis.readUnsignedByte();
            minorVersion = dis.readUnsignedByte();
            int encType = dis.readUnsignedByte();
            encoderType = (encType == 0) ? "POINT_CLOUD" : "TRIANGULAR_MESH";
            int encMethod = dis.readUnsignedByte();
            encoderMethod = (encMethod == 0) ? "MESH_SEQUENTIAL_ENCODING" : "MESH_EDGEBREAKER_ENCODING";
            // The flags field is little-endian; DataInputStream reads big-endian,
            // so combine the two bytes manually.
            int lo = dis.readUnsignedByte();
            int hi = dis.readUnsignedByte();
            flags = lo | (hi << 8);
            hasMetadata = (flags & 0x8000) != 0;
            // Metadata, connectivity, attributes parsing similar to Python (implement recursively for metadata)
            // For brevity, stubbed; extend as needed
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    public void printProperties() {
        System.out.println("Magic: " + magic);
        System.out.println("Major Version: " + majorVersion);
        System.out.println("Minor Version: " + minorVersion);
        System.out.println("Encoder Type: " + encoderType);
        System.out.println("Encoder Method: " + encoderMethod);
        System.out.println("Flags: " + flags);
        System.out.println("Has Metadata: " + hasMetadata);
        // Print other properties...
    }

    public void write(String outputPath) throws IOException {
        try (FileOutputStream fos = new FileOutputStream(outputPath)) {
            fos.write("DRACO".getBytes());
            fos.write(majorVersion);
            fos.write(minorVersion);
            int encType = "POINT_CLOUD".equals(encoderType) ? 0 : 1;
            fos.write(encType);
            int encMethod = "MESH_SEQUENTIAL_ENCODING".equals(encoderMethod) ? 0 : 1;
            fos.write(encMethod);
            ByteBuffer bb = ByteBuffer.allocate(2).order(ByteOrder.LITTLE_ENDIAN).putShort((short) flags);
            fos.write(bb.array());
            // Add more for a full writer
        }
    }

    public static void main(String[] args) {
        if (args.length > 0) {
            DRCFile drc = new DRCFile(args[0]);
            drc.printProperties();
        }
    }
}
Note: Full parsing and writing are truncated for brevity; extend the class with the same metadata/connectivity/attribute logic as the Python version.
6. JavaScript Class for .DRC File Handling
Here's a JavaScript class (Node.js compatible) that can open (read from file), decode, read properties, write a basic file (header only), and print to the console. It uses fs and Buffer.
const fs = require('fs');

class DRCFile {
  constructor(filepath = null) {
    this.filepath = filepath;
    this.properties = {};
    if (filepath) {
      this.decode();
    }
  }

  // `cursor` is a mutable {offset} object so the read position survives the call.
  readVarUI32(buffer, cursor) {
    let value = 0;
    let shift = 0;
    while (true) {
      const byte = buffer.readUInt8(cursor.offset++);
      value |= (byte & 0x7F) << shift;
      if ((byte & 0x80) === 0) break;
      shift += 7;
    }
    return value;
  }

  decode() {
    const data = fs.readFileSync(this.filepath);
    const offset = { offset: 0 };
    // Header
    this.properties.magic = data.toString('ascii', offset.offset, offset.offset += 5);
    if (this.properties.magic !== 'DRACO') throw new Error('Invalid DRC file');
    this.properties.major_version = data.readUInt8(offset.offset++);
    this.properties.minor_version = data.readUInt8(offset.offset++);
    const encType = data.readUInt8(offset.offset++);
    this.properties.encoder_type = encType === 0 ? 'POINT_CLOUD' : 'TRIANGULAR_MESH';
    const encMethod = data.readUInt8(offset.offset++);
    this.properties.encoder_method = encMethod === 0 ? 'MESH_SEQUENTIAL_ENCODING' : 'MESH_EDGEBREAKER_ENCODING';
    this.properties.flags = data.readUInt16LE(offset.offset); offset.offset += 2;
    this.properties.has_metadata = !!(this.properties.flags & 0x8000);
    // Metadata, connectivity, attributes: stubbed for brevity (see the sketch below).
  }

  printProperties() {
    for (const [key, value] of Object.entries(this.properties)) {
      console.log(`${key}: ${value}`);
    }
  }

  write(outputPath) {
    const buffer = Buffer.alloc(1024); // allocate generously, trim on write
    let pos = 0;
    buffer.write('DRACO', pos); pos += 5;
    buffer.writeUInt8(this.properties.major_version || 1, pos++);
    buffer.writeUInt8(this.properties.minor_version || 0, pos++);
    const encType = this.properties.encoder_type === 'POINT_CLOUD' ? 0 : 1;
    buffer.writeUInt8(encType, pos++);
    const encMethod = this.properties.encoder_method === 'MESH_SEQUENTIAL_ENCODING' ? 0 : 1;
    buffer.writeUInt8(encMethod, pos++);
    buffer.writeUInt16LE(this.properties.flags || 0, pos); pos += 2;
    // Add more fields here for a full writer
    fs.writeFileSync(outputPath, buffer.subarray(0, pos));
  }
}

// Example
if (process.argv.length > 2) {
  const drc = new DRCFile(process.argv[2]);
  drc.printProperties();
}
Note: Extend with full parsing, as in the HTML example in section 3; a sketch of the recursive metadata parser follows.
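The following is a hedged sketch of that extension (the names readVarUint32 and parseMetadataElement are illustrative, not a Draco API). It parses one metadata element from a Buffer using the entry/sub-metadata layout listed in section 1; the cursor argument is the same mutable {offset} object style used by readVarUI32 in the class above.

// Sketch: recursive metadata-element parser for the Node.js class above.
function readVarUint32(buffer, cursor) {
  // LEB128-style varUI32; advances cursor.offset in place.
  let value = 0, shift = 0, byte;
  do {
    byte = buffer.readUInt8(cursor.offset++);
    value |= (byte & 0x7f) << shift;
    shift += 7;
  } while (byte & 0x80);
  return value >>> 0;
}

function parseMetadataElement(buffer, cursor) {
  const meta = { entries: [], subMetadata: [] };
  const numEntries = readVarUint32(buffer, cursor);
  for (let i = 0; i < numEntries; i++) {
    const keySize = buffer.readUInt8(cursor.offset++);
    const key = buffer.toString('utf8', cursor.offset, cursor.offset += keySize);
    const valueSize = buffer.readUInt8(cursor.offset++);
    const value = buffer.toString('utf8', cursor.offset, cursor.offset += valueSize);
    meta.entries.push([key, value]);
  }
  const numSub = readVarUint32(buffer, cursor);
  for (let i = 0; i < numSub; i++) {
    meta.subMetadata.push(parseMetadataElement(buffer, cursor));
  }
  return meta;
}

Inside decode(), each attribute metadata element would be handled by reading its varUI32 ID and then calling parseMetadataElement(data, offset), followed by one more call for the file-level metadata.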
7. C Class for .DRC File Handling
Here's a C++ class (using standard headers such as <fstream>, <cstdint>, and <iostream>) for handling .DRC files. "Class in C" is taken to mean C++, since C has no classes, only structs.
#include <cstdint>
#include <fstream>
#include <iostream>
#include <string>

class DRCFile {
private:
    std::string filepath;
    std::string magic;
    uint8_t major_version = 0;
    uint8_t minor_version = 0;
    std::string encoder_type;
    std::string encoder_method;
    uint16_t flags = 0;
    bool has_metadata = false;
    // Add other properties...

public:
    DRCFile(const std::string& fp = "") : filepath(fp) {
        if (!fp.empty()) decode();
    }

    uint32_t read_var_ui32(std::ifstream& ifs) {
        // LEB128-style varUI32 reader (provided for extending the parser).
        uint32_t value = 0;
        int shift = 0;
        while (true) {
            uint8_t byte;
            ifs.read(reinterpret_cast<char*>(&byte), 1);
            value |= static_cast<uint32_t>(byte & 0x7F) << shift;
            if ((byte & 0x80) == 0) break;
            shift += 7;
        }
        return value;
    }

    void decode() {
        std::ifstream ifs(filepath, std::ios::binary);
        if (!ifs) return;
        char mag[6];
        ifs.read(mag, 5);
        mag[5] = '\0';
        magic = mag;
        if (magic != "DRACO") {
            std::cerr << "Invalid DRC file" << std::endl;
            return;
        }
        ifs.read(reinterpret_cast<char*>(&major_version), 1);
        ifs.read(reinterpret_cast<char*>(&minor_version), 1);
        uint8_t enc_type;
        ifs.read(reinterpret_cast<char*>(&enc_type), 1);
        encoder_type = (enc_type == 0) ? "POINT_CLOUD" : "TRIANGULAR_MESH";
        uint8_t enc_method;
        ifs.read(reinterpret_cast<char*>(&enc_method), 1);
        encoder_method = (enc_method == 0) ? "MESH_SEQUENTIAL_ENCODING" : "MESH_EDGEBREAKER_ENCODING";
        // Flags are little-endian in the bitstream; reading the raw bytes into a
        // uint16_t assumes a little-endian host.
        ifs.read(reinterpret_cast<char*>(&flags), 2);
        has_metadata = (flags & 0x8000) != 0;
        // Metadata, etc. parsing (similar logic)
    }

    void print_properties() {
        std::cout << "Magic: " << magic << std::endl;
        std::cout << "Major Version: " << static_cast<int>(major_version) << std::endl;
        std::cout << "Minor Version: " << static_cast<int>(minor_version) << std::endl;
        std::cout << "Encoder Type: " << encoder_type << std::endl;
        std::cout << "Encoder Method: " << encoder_method << std::endl;
        std::cout << "Flags: " << flags << std::endl;
        std::cout << "Has Metadata: " << (has_metadata ? "Yes" : "No") << std::endl;
        // Print others...
    }

    void write(const std::string& output_path) {
        std::ofstream ofs(output_path, std::ios::binary);
        ofs.write("DRACO", 5);
        ofs.write(reinterpret_cast<const char*>(&major_version), 1);
        ofs.write(reinterpret_cast<const char*>(&minor_version), 1);
        uint8_t enc_type = (encoder_type == "POINT_CLOUD") ? 0 : 1;
        ofs.write(reinterpret_cast<const char*>(&enc_type), 1);
        uint8_t enc_method = (encoder_method == "MESH_SEQUENTIAL_ENCODING") ? 0 : 1;
        ofs.write(reinterpret_cast<const char*>(&enc_method), 1);
        // Little-endian host assumed for the 2-byte flags field.
        ofs.write(reinterpret_cast<const char*>(&flags), 2);
        // Add more for a full writer
    }
};

int main(int argc, char** argv) {
    if (argc > 1) {
        DRCFile drc(argv[1]);
        drc.print_properties();
    }
    return 0;
}
Note: As with the other examples, extend this class for full parsing and writing. These examples focus on the header and metadata for conciseness, since full bitstream encoding and decoding would require a complete implementation matching the Draco library.