LoadDataFromFileCommand

Overview

LoadDataFromFileCommand loads data from CSV or Excel files in a specified folder. It supports filtering, searching, column selection, pagination, and streaming for large files. This command is designed for bulk data imports and can be called from BPM processes for automated data loading workflows.

When to Use

Use LoadDataFromFileCommand for:

  • Bulk data imports from CSV/Excel files
  • Automated data loading in BPM processes
  • Processing large datasets (millions of rows with CSV streaming)
  • Extracting specific columns from files
  • Filtering and searching data before import
  • Daily/scheduled file imports
  • ETL (Extract, Transform, Load) workflows

Don't use for:

  • Real-time data entry (use database commands)
  • Single record operations
  • Binary files (use specialized file readers)
  • Complex transformations (load the data first, then transform it in a post-processing step)

Performance Guidelines

| File Type | Best For       | Max Rows | Memory Usage      | Speed            |
|-----------|----------------|----------|-------------------|------------------|
| CSV       | Large datasets | Millions | ~50MB (streaming) | ~10,000 rows/sec |
| Excel     | Small datasets | <10,000  | High (in-memory)  | ~2,000 rows/sec  |

💡 Recommendation: Use CSV for files with >10,000 rows. Use Excel for smaller files with formatting needs.
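
If you can't control which file type arrives, one defensive pattern (a sketch only; the 10,000-row cap is an illustrative threshold, not a documented limit) is to cap Excel loads with takeRows so an oversized workbook can't exhaust memory:

// Sketch: cap an Excel load defensively (the 10,000-row limit is an assumed threshold)
var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\Imports',
    filePattern: 'report_*.xlsx',
    fileType: 'excel',
    takeRows: 10000 // never pull more than this into memory in one call
  }
});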

Syntax

1. Basic Usage - Load Latest CSV File

var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\Imports',
    filePattern: '*.csv'
  }
});

// Access the data
var rows = result.Data.rows;
var rowCount = result.Data.rowCount;
var fileName = result.Data.fileName;

// Process each row
for (var i = 0; i < rows.length; i++) {
  var row = rows[i];
  console.log('Customer: ' + row.CustomerName + ', Amount: ' + row.Amount);
}

2. Select Specific Columns

var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\Sales',
    filePattern: 'Sales_*.csv',
    selectColumns: ['CustomerID', 'OrderDate', 'TotalAmount', 'Status']
  }
});

// Only specified columns are returned
// Result: [{CustomerID: '123', OrderDate: '2025-01-14', TotalAmount: 5000, Status: 'Completed'}, ...]

3. Column Selection with Rename

var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\HR',
    filePattern: 'employees.xlsx',
    fileType: 'excel',
    selectColumns: ['EmployeeID', 'FullName', 'Department', 'Salary'],
    columnMapping: {
      EmployeeID: 'id',
      FullName: 'name',
      Department: 'dept',
      Salary: 'monthlySalary'
    }
  }
});

// Columns are renamed
// Result: [{id: 'EMP001', name: 'John Doe', dept: 'IT', monthlySalary: 50000}, ...]

4. Filter and Search Data

var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\Orders',
    filePattern: 'orders_2025*.csv',
    searchColumn: 'Status',
    searchValue: 'Pending'
  }
});

// Only rows where Status contains 'Pending'

5. Pagination for Large Files

// Load first 100 rows
var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\BigFile',
    filePattern: 'transactions.csv',
    skipRows: 0,
    takeRows: 100
  }
});

// Load next 100 rows
var nextBatch = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\BigFile',
    filePattern: 'transactions.csv',
    skipRows: 100,
    takeRows: 100
  }
});

6. Load Specific Excel Sheet

var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\Reports',
    filePattern: 'monthly_report.xlsx',
    fileType: 'excel',
    sheetName: 'January',
    hasHeaderRow: true,
    headerRowIndex: 1
  }
});

7. Complete Example - Daily Import Process

// Daily customer data import with validation
var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\Daily',
    filePattern: 'customers_*.csv',
    selectColumns: ['CustomerID', 'Name', 'Email', 'Phone', 'AccountBalance'],
    searchColumn: 'AccountBalance',
    searchValue: '', // Get all rows (no filter)
    hasHeaderRow: true
  }
});

if (result.IsSuccessful) {
  var customers = result.Data.rows;

  // Process each customer
  for (var i = 0; i < customers.length; i++) {
    var customer = customers[i];

    // Validate and insert
    if (customer.Email && customer.Phone) {
      var insertResult = doCmd('DoSqlCommand', {
        Data: {
          sqlQuery: `INSERT INTO Customers (CustomerId, Name, Email, Phone, Balance)
                     VALUES (@customerId, @name, @email, @phone, @balance)`,
          parameters: {
            customerId: customer.CustomerID,
            name: customer.Name,
            email: customer.Email,
            phone: customer.Phone,
            balance: parseFloat(customer.AccountBalance)
          }
        }
      });
    }
  }

  console.log('Imported ' + customers.length + ' customers from ' + result.Data.fileName);
}

Parameters

Data Object

| Parameter | Type | Required | Default | Description |
|-----------|------|----------|---------|-------------|
| folderPath | string | ✅ Yes | - | Full path to the folder containing files (e.g., C:\\Data\\Imports) |
| filePattern | string | No | *.* | File pattern to match (e.g., *.csv, Sales_*.xlsx, data_2025*.csv) |
| fileType | string | No | auto | File type: auto, csv, or excel. Auto-detected from the extension |
| selectColumns | array | No | null | Array of column names to select (e.g., ['ID', 'Name', 'Amount']) |
| columnMapping | object | No | null | Map to rename columns (e.g., {OldName: 'NewName', ID: 'CustomerId'}) |
| searchColumn | string | No | null | Column name to search/filter on |
| searchValue | string | No | null | Value to search for (case-insensitive contains) |
| skipRows | number | No | 0 | Number of data rows to skip (for pagination) |
| takeRows | number | No | null | Maximum number of rows to return (for pagination) |
| sheetName | string | No | First sheet | Excel only: specific sheet name to read |
| hasHeaderRow | boolean | No | true | Whether the first row contains column headers |
| headerRowIndex | number | No | 1 | Excel only: which row contains the headers (1-based) |
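
As a quick reference, the sketch below combines most of these parameters in a single call; the folder, pattern, sheet name, and column names are placeholders, not required values:

// Sketch: most parameters combined in one call (paths and names are placeholders)
var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\Imports',
    filePattern: 'ledger_*.xlsx',
    fileType: 'excel',
    sheetName: 'Postings',
    hasHeaderRow: true,
    headerRowIndex: 2, // headers on the second row
    selectColumns: ['Account', 'Amount', 'PostingDate'],
    columnMapping: { PostingDate: 'date' },
    searchColumn: 'Account',
    searchValue: 'GL-',
    skipRows: 0,
    takeRows: 500
  }
});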

Response

Success Response

{
  "IsSuccessful": true,
  "StatusCode": "00",
  "Message": "Successfully loaded 250 row(s) from customers_2025-01-14.csv",
  "Data": {
    "fileName": "customers_2025-01-14.csv",
    "filePath": "C:\\Data\\Imports\\customers_2025-01-14.csv",
    "fileType": "csv",
    "rowCount": 250,
    "rows": [
      {
        "CustomerID": "CUST001",
        "Name": "John Doe",
        "Email": "john@example.com",
        "Phone": "+2348012345678",
        "AccountBalance": "50000"
      },
      {
        "CustomerID": "CUST002",
        "Name": "Jane Smith",
        "Email": "jane@example.com",
        "Phone": "+2348087654321",
        "AccountBalance": "75000"
      }
      // ... more rows
    ]
  }
}
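
In the sample response above, AccountBalance comes back as a string, so convert numeric fields before doing arithmetic. A minimal sketch:

// Values in rows are strings; convert numeric fields before arithmetic
var rows = result.Data.rows;
var total = 0;
for (var i = 0; i < rows.length; i++) {
  total += parseFloat(rows[i].AccountBalance) || 0; // treat unparsable values as 0
}
console.log('Total balance across ' + result.Data.rowCount + ' rows: ' + total);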

Error Response

{
  "IsSuccessful": false,
  "StatusCode": "99",
  "Message": "Folder not found: C:\\Data\\NonExistent"
}

How Column Selection Works

Step 1: Load All Columns (Default)

If you don't specify selectColumns, all columns are loaded:

// Input file: ID, Name, Email, Phone, Address, City, State, ZipCode
var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data',
    filePattern: 'contacts.csv'
  }
});
// Output: All 8 columns returned

Step 2: Select Specific Columns

Use selectColumns to pick only the columns you need:

// Pick only 3 columns
var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data',
    filePattern: 'contacts.csv',
    selectColumns: ['ID', 'Name', 'Email']
  }
});
// Output: Only ID, Name, Email columns (Address, Phone, etc. excluded)

Step 3: Select + Rename

Combine selectColumns with columnMapping to pick and rename:

var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data',
    filePattern: 'contacts.csv',
    selectColumns: ['ID', 'Name', 'Email'], // Pick these
    columnMapping: {                        // Then rename
      ID: 'contactId',
      Name: 'fullName',
      Email: 'emailAddress'
    }
  }
});
// Output: contactId, fullName, emailAddress

Step 4: Select + Filter

Pick columns and filter rows:

var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data',
    filePattern: 'orders.csv',
    selectColumns: ['OrderID', 'CustomerName', 'TotalAmount', 'Status'],
    searchColumn: 'Status',
    searchValue: 'Completed'
  }
});
// Output: Only 4 columns, only rows where Status contains 'Completed'

Common Use Cases

Use Case 1: Daily Customer Import

// Import new customers from CSV file dropped daily
var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\Daily\\Customers',
    filePattern: 'customers_*.csv',
    selectColumns: ['CustomerID', 'Name', 'Email', 'Phone', 'AccountType']
  }
});

if (result.IsSuccessful) {
  var customers = result.Data.rows;

  for (var i = 0; i < customers.length; i++) {
    var customer = customers[i];

    // Insert each customer into database (map file columns to SQL parameters)
    doCmd('DoSqlCommand', {
      Data: {
        sqlQuery: 'INSERT INTO Customers (CustomerId, Name, Email, Phone, AccountType) VALUES (@id, @name, @email, @phone, @type)',
        parameters: {
          id: customer.CustomerID,
          name: customer.Name,
          email: customer.Email,
          phone: customer.Phone,
          type: customer.AccountType
        }
      }
    });
  }
}

Use Case 2: Transaction Reconciliation

// Load bank transactions and reconcile
var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\BankFiles',
    filePattern: 'transactions_*.csv',
    selectColumns: ['TransactionID', 'AccountNumber', 'Amount', 'Date', 'Type'],
    searchColumn: 'Type',
    searchValue: 'Credit' // Only credit transactions
  }
});

var credits = result.Data.rows;

for (var i = 0; i < credits.length; i++) {
  var txn = credits[i];

  // Check if transaction exists
  var existing = doCmd('DoSqlQuery', {
    Data: {
      sqlQuery: 'SELECT Id FROM Transactions WHERE TransactionId = @txnId',
      parameters: { txnId: txn.TransactionID }
    }
  });

  if (existing.Data.Count === 0) {
    // New transaction - process it
    doCmd('ProcessCreditTransaction', { Data: txn });
  }
}

Use Case 3: Monthly Report Processing

// Process Excel report with multiple sheets
var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Reports\\Monthly',
    filePattern: 'sales_report.xlsx',
    fileType: 'excel',
    sheetName: 'Summary',
    selectColumns: ['Region', 'TotalSales', 'Target', 'Achievement']
  }
});

var summary = result.Data.rows;

// Calculate totals
var totalSales = 0;
for (var i = 0; i < summary.length; i++) {
  totalSales += parseFloat(summary[i].TotalSales);
}

console.log('Total company sales: ' + totalSales);

Use Case 4: Large File Batch Processing

// Process large file in batches of 1000 rows
var batchSize = 1000;
var currentBatch = 0;
var hasMoreRows = true;

while (hasMoreRows) {
  var result = doCmd('LoadDataFromFileCommand', {
    Data: {
      folderPath: 'C:\\Data\\BigFiles',
      filePattern: 'huge_file.csv',
      skipRows: currentBatch * batchSize,
      takeRows: batchSize
    }
  });

  if (result.IsSuccessful && result.Data.rowCount > 0) {
    var batch = result.Data.rows;

    // Process this batch
    for (var i = 0; i < batch.length; i++) {
      // Process each row
      processRow(batch[i]);
    }

    currentBatch++;
    hasMoreRows = (result.Data.rowCount === batchSize);
  } else {
    hasMoreRows = false;
  }
}

Best Practices

✅ Do's

  1. Use CSV for Large Files

    // Good: CSV streams data (low memory)
    var result = doCmd('LoadDataFromFileCommand', {
      Data: {
        folderPath: 'C:\\Data',
        filePattern: 'large_file.csv' // Can handle millions of rows
      }
    });
  2. Select Only Needed Columns

    // Good: Select only what you need
    var result = doCmd('LoadDataFromFileCommand', {
      Data: {
        folderPath: 'C:\\Data',
        filePattern: 'data.csv',
        selectColumns: ['ID', 'Name', 'Amount'] // Pick 3 out of 50 columns
      }
    });
  3. Use Pagination for Very Large Files

    // Good: Process in chunks
    var result = doCmd('LoadDataFromFileCommand', {
      Data: {
        folderPath: 'C:\\Data',
        filePattern: 'big.csv',
        skipRows: 0,
        takeRows: 5000 // Process 5000 rows at a time
      }
    });
  4. Filter Early to Reduce Data

    // Good: Filter at load time
    var result = doCmd('LoadDataFromFileCommand', {
      Data: {
        folderPath: 'C:\\Data',
        filePattern: 'transactions.csv',
        searchColumn: 'Status',
        searchValue: 'Active' // Only load active records
      }
    });

❌ Don'ts

  1. Don't Load Entire Large Excel File

    // Bad: Excel loads entire file into memory
    var result = doCmd('LoadDataFromFileCommand', {
      Data: {
        folderPath: 'C:\\Data',
        filePattern: 'huge_data.xlsx' // ❌ May cause memory issues
      }
    });
  2. Don't Load All Columns When You Need Few

    // Bad: Loading all 100 columns
    var result = doCmd('LoadDataFromFileCommand', {
      Data: {
        folderPath: 'C:\\Data',
        filePattern: 'wide_data.csv' // ❌ 100 columns when you need 3
      }
    });

    // Good: Select only what you need
    var result = doCmd('LoadDataFromFileCommand', {
      Data: {
        folderPath: 'C:\\Data',
        filePattern: 'wide_data.csv',
        selectColumns: ['ID', 'Name', 'Amount'] // ✅ Only 3 columns
      }
    });
  3. Don't Process Entire Large File at Once

    // Bad: Loading millions of rows
    var result = doCmd('LoadDataFromFileCommand', {
      Data: {
        folderPath: 'C:\\Data',
        filePattern: 'huge.csv' // ❌ Returns millions of rows
      }
    });

    // Good: Use pagination
    var result = doCmd('LoadDataFromFileCommand', {
      Data: {
        folderPath: 'C:\\Data',
        filePattern: 'huge.csv',
        takeRows: 10000 // ✅ Process in batches
      }
    });

Error Handling

var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\Imports',
    filePattern: 'data.csv'
  }
});

if (!result.IsSuccessful) {
  // Handle errors
  switch (result.StatusCode) {
    case '99':
      if (result.Message.includes('Folder not found')) {
        console.error('Import folder does not exist');
        // Create folder or alert admin
      } else if (result.Message.includes('No files found')) {
        console.warn('No files to process today');
        // This might be expected
      } else {
        console.error('Import failed: ' + result.Message);
        // Alert admin
      }
      break;
  }
  return false;
}

// Success - process data
var rows = result.Data.rows;
console.log('Loaded ' + rows.length + ' rows from ' + result.Data.fileName);

FAQ

Q: How do I select multiple specific columns?

A: Pass an array to selectColumns:

selectColumns: ['CustomerID', 'Name', 'Email', 'Balance']

Q: Can I select columns AND rename them?

A: Yes! Use both selectColumns (to pick) and columnMapping (to rename):

selectColumns: ['ID', 'Name'],
columnMapping: { ID: 'customerId', Name: 'fullName' }

Q: What if a column doesn't exist?

A: If you select a non-existent column, it will be returned as null in the result.
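
If your workflow depends on certain columns being present, a simple guard (a sketch; the column names are just examples) is to inspect the first row before processing:

// Sketch: verify expected columns before processing (column names are examples)
var expected = ['CustomerID', 'Email'];
var first = result.Data.rows[0] || {};
for (var i = 0; i < expected.length; i++) {
  if (first[expected[i]] === null || first[expected[i]] === undefined) {
    console.error('Missing or empty column: ' + expected[i]);
  }
}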

Q: Does column selection improve performance?

A: Yes! Selecting fewer columns:

  • Reduces memory usage
  • Speeds up data transfer
  • Makes processing faster
  • Especially important for wide files (many columns)

Q: Can I select columns by index instead of name?

A: No, column selection requires column names. If your file has no headers, set hasHeaderRow: false and columns will be named Column0, Column1, etc.
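
For example, a headerless CSV can still be given friendly names, assuming columnMapping also applies to the generated Column0, Column1, ... names (a sketch; the target names are placeholders):

// Sketch: load a headerless file and rename the generated column names
var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\Raw',
    filePattern: 'export.csv',
    hasHeaderRow: false,
    columnMapping: { Column0: 'customerId', Column1: 'amount' } // assumes mapping works on generated names
  }
});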

Q: How many rows can I load at once?

A:

  • CSV: Can stream millions of rows, but use takeRows to limit memory
  • Excel: Recommended <10,000 rows per load
  • Best practice: Use pagination for >5,000 rows

Q: What's the difference between selectColumns and columnMapping?

A:

  • selectColumns: Picks which columns to include (filters columns)
  • columnMapping: Renames columns that are already selected
  • They work together: First select, then rename

Q: Can I use wildcards in selectColumns?

A: No, you must specify exact column names. Omit selectColumns (its default is null) to load all columns, then pick the ones you need in your code, as sketched below.
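
A sketch of that client-side selection (assumes result already holds a load of all columns; the 'Amt_' prefix rule is only an example):

// Sketch: keep only columns whose names start with 'Amt_' (example rule)
var rows = result.Data.rows;
var trimmed = [];
for (var i = 0; i < rows.length; i++) {
  var out = {};
  for (var key in rows[i]) {
    if (key.indexOf('Amt_') === 0) {
      out[key] = rows[i][key];
    }
  }
  trimmed.push(out);
}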

Technical Notes

  • Files are processed by last modified date (most recent first)
  • CSV uses UTF-8 encoding with streaming for memory efficiency
  • Excel uses ClosedXML library with in-memory processing
  • Column names are case-sensitive
  • Search filter uses case-insensitive contains matching
  • Only the first matching file is processed (by default)
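
Because only the first (most recently modified) matching file is processed, a scheduled job that must pick up a specific day's file can build a dated pattern at run time. A sketch, assuming a customers_YYYY-MM-DD.csv naming convention:

// Sketch: build a dated pattern so exactly one day's file matches (naming convention assumed)
var today = new Date();
var stamp = today.getFullYear() + '-' +
  ('0' + (today.getMonth() + 1)).slice(-2) + '-' +
  ('0' + today.getDate()).slice(-2); // e.g., 2025-01-14
var result = doCmd('LoadDataFromFileCommand', {
  Data: {
    folderPath: 'C:\\Data\\Daily',
    filePattern: 'customers_' + stamp + '.csv'
  }
});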

Need Help? Contact the development team or check the Execution Engine documentation.