Batch Operations
Efficiently process multiple documents in a single request.
Batch Create
JavaScript
// Create multiple documents at once
const users = await db.batchCreate('users', [
  { name: 'Alice', email: 'alice@example.com' },
  { name: 'Bob', email: 'bob@example.com' },
  { name: 'Charlie', email: 'charlie@example.com' }
]);
console.log(`Created ${users.length} users`);

Performance Tip
Batch operations are significantly faster than individual operations because the entire batch is sent in a single network round trip instead of one request per document. Use them whenever you create, update, or delete multiple documents.
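To make the round-trip saving concrete, here is a toy in-memory model (synchronous for brevity; the real client returns promises you would await). The `MockDb` class, its method names, and the `doc_N` id scheme are illustrative assumptions, not part of the actual client:

```javascript
// Toy model: each public method call simulates one network round trip.
class MockDb {
  constructor() {
    this.requests = 0;
    this.docs = [];
  }
  insertOne(doc) {
    const record = { id: `doc_${this.docs.length + 1}`, ...doc };
    this.docs.push(record);
    return record;
  }
  create(collection, doc) {        // one request per document
    this.requests += 1;
    return this.insertOne(doc);
  }
  batchCreate(collection, docs) {  // one request for the whole batch
    this.requests += 1;
    return docs.map((d) => this.insertOne(d));
  }
}

const oneByOne = new MockDb();
[{ name: 'Alice' }, { name: 'Bob' }, { name: 'Charlie' }]
  .forEach((d) => oneByOne.create('users', d));

const batched = new MockDb();
batched.batchCreate('users', [
  { name: 'Alice' }, { name: 'Bob' }, { name: 'Charlie' },
]);

console.log(oneByOne.requests, batched.requests); // 3 1
```

Same three documents stored either way, but the batched path makes one simulated request instead of three.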
Batch Update
JavaScript
// Update multiple documents
const updates = [
  { id: 'doc_1', data: { status: 'active' } },
  { id: 'doc_2', data: { status: 'active' } },
  { id: 'doc_3', data: { status: 'active' } }
];
const result = await db.batchUpdate('users', updates);
console.log(`Updated ${result.modified} documents`);

Batch Delete
JavaScript
// Delete multiple documents
const ids = ['doc_1', 'doc_2', 'doc_3'];
await db.batchDelete('users', ids);
console.log(`Deleted ${ids.length} documents`);

Processing Large Datasets
Chunked Processing
async function importLargeDataset(data) {
  const chunkSize = 1000;
  for (let i = 0; i < data.length; i += chunkSize) {
    const chunk = data.slice(i, i + chunkSize);
    await db.batchCreate('collection', chunk);
    console.log(`Processed ${Math.min(i + chunkSize, data.length)}/${data.length}`);
  }
}
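The slicing arithmetic in the loop above can be factored into a small standalone helper and checked without touching the database (the helper name `chunked` is an illustrative choice, not part of the client):

```javascript
// Split an array into consecutive chunks of at most `size` elements;
// the last chunk holds whatever remains.
function chunked(array, size) {
  const chunks = [];
  for (let i = 0; i < array.length; i += size) {
    chunks.push(array.slice(i, i + size));
  }
  return chunks;
}

const sizes = chunked(Array.from({ length: 2500 }, (_, i) => i), 1000)
  .map((chunk) => chunk.length);
console.log(sizes); // [ 1000, 1000, 500 ]
```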
// Import 100,000 documents efficiently
await importLargeDataset(largeDataArray);

Performance Comparison
Individual Operations
  1,000 documents: ~2.5s
  10,000 documents: ~25s

Batch Operations
  1,000 documents: ~0.3s
  10,000 documents: ~3s
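These figures are illustrative; actual timings depend on network latency, document size, and server load, so measure in your own environment. A minimal harness sketch follows, where the stub `db` (an assumption, resolving instantly) stands in for the real client; swap in yours to get meaningful numbers:

```javascript
// Rough timing harness; absolute numbers depend on your environment.
// The stub below resolves instantly, so it demonstrates the shape of
// the measurement, not realistic figures.
const db = {
  async create(collection, doc) { return doc; },        // one request each
  async batchCreate(collection, docs) { return docs; }, // single request
};

async function timeIt(fn) {
  const start = Date.now();
  await fn();
  return Date.now() - start; // elapsed milliseconds
}

async function compare() {
  const docs = Array.from({ length: 1000 }, (_, i) => ({ n: i }));
  const individual = await timeIt(async () => {
    for (const doc of docs) await db.create('users', doc);
  });
  const batch = await timeIt(async () => {
    await db.batchCreate('users', docs);
  });
  return { individual, batch };
}

const report = compare();
report.then((r) =>
  console.log(`individual: ${r.individual}ms, batch: ${r.batch}ms`));
```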