The Table File Reader processes Excel and CSV files, extracting structured data from spreadsheets with support for thousands of rows and multiple sheets. This reader handles both bulk data import and single-row processing with cell references across multiple sheets for use in automation workflows.

Key Features

  • Multi-Format Support: Process Excel (.xlsx, .xls) and CSV files with automatic format detection
  • High-Volume Processing: Handle thousands of rows efficiently with batch processing capabilities
  • Multi-Sheet Support: Extract data from multiple Excel sheets with cross-sheet cell references
  • Flexible Data Mapping: Use column names or cell references for field mapping and data extraction

Supported File Types

The Table File Reader processes Excel workbooks (.xlsx, .xls) and CSV files (.csv); the format is detected automatically from the uploaded file.

Processing Modes

The Table File Reader supports two primary processing modes:

Bulk Data Processing

Process thousands of rows systematically using column names:
Use Case: Import large datasets where each row represents a record
Configuration:
  • Header Row: Specify which row contains column names
  • Data Rows: Define the range of rows to process
  • Field Mapping: Map column names to field names
  • Data Types: Configure field types (Text, Number, Date, etc.)
Best for: Data imports, bulk updates, reporting data
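The column-to-field mapping described above can be sketched in plain Python. This is an illustrative example, not the product's internal implementation; the field names and type converters are assumptions:

```python
import csv
import io
from datetime import datetime

# Hypothetical mapping: column header -> (field name, type converter).
FIELD_MAP = {
    "Customer_Name": ("customer_name", str),
    "Amount": ("amount", float),
    "Purchase_Date": ("purchase_date",
                      lambda s: datetime.strptime(s, "%Y-%m-%d").date()),
}

def read_rows(csv_text, field_map):
    """Yield one dict per data row, applying the column-to-field mapping."""
    reader = csv.DictReader(io.StringIO(csv_text))
    for row in reader:
        yield {field: convert(row[col])
               for col, (field, convert) in field_map.items()}

sample = "Customer_Name,Amount,Purchase_Date\nAda Lovelace,19.99,2024-03-01\n"
records = list(read_rows(sample, FIELD_MAP))
```

The header row supplies the column names, so the mapping stays stable even if columns are reordered between files.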

Single Row Processing

Extract specific data using cell references across multiple sheets:
Use Case: Extract specific values from template-based spreadsheets
Configuration:
  • Cell References: Use Excel notation (A1, B2, Sheet1!C3)
  • Sheet Selection: Specify which sheets to process
  • Field Names: Define custom field names for each cell
  • Data Types: Set appropriate types for each field
Best for: Forms, reports, template-based data extraction
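To make the Excel notation concrete, here is a minimal parser for references like A1 and Sheet1!C3. It is a sketch for illustration (the reader's own parsing is not documented here); it assumes uppercase column letters:

```python
import re

# Matches "A1", "B2", or "Sheet1!C3" (uppercase column letters assumed).
_REF = re.compile(r"^(?:(?P<sheet>[^!]+)!)?(?P<col>[A-Z]+)(?P<row>\d+)$")

def parse_ref(ref, default_sheet="Sheet1"):
    """Return (sheet, row, column) with 1-based numeric indexes."""
    m = _REF.match(ref)
    if m is None:
        raise ValueError(f"invalid cell reference: {ref}")
    col = 0
    for ch in m.group("col"):   # base-26 column letters: A=1, Z=26, AA=27
        col = col * 26 + (ord(ch) - ord("A") + 1)
    return (m.group("sheet") or default_sheet, int(m.group("row")), col)
```

For example, `parse_ref("Sheet2!AA10")` resolves to sheet "Sheet2", row 10, column 27.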

Creating a Table File Reader

1. Navigate to File Readers

   In your application, go to the File Readers section.

2. Create New Reader

   Click + File Reader and select Table Data from the document type options.

3. Configure Basic Settings

   Name: Enter a descriptive name (e.g., “Customer Data Import”)
   Description: Optional description for your team
   Processing Mode: Choose between bulk processing and single-row processing

4. Set Up Field Mapping

   For Bulk Processing: Map column names to field names
   For Single Row: Define cell references for each field
   Configure field types and validation rules.

5. Test with Sample File

   Upload a sample Excel or CSV file to validate data extraction.

Field Configuration

Column-Based Field Mapping

For bulk data processing using column names:
Configuration:
  • Column Name: “Customer_Name” → Field Name: “customer_name”
  • Column Name: “Email_Address” → Field Name: “email”
  • Column Name: “Purchase_Date” → Field Name: “purchase_date” (Date)
  • Column Name: “Amount” → Field Name: “amount” (Decimal)
Best for: Clean, well-structured data files

Cell Reference Configuration

For single-row processing using specific cell locations:
Configuration:
  • Customer Name: A2
  • Order Date: B2
  • Total Amount: C2
  • Status: D2
Best for: Simple forms or single-sheet reports

Using in Automations

Integration with Data Workflows

The Table File Reader integrates with automation workflows for comprehensive data processing:
File Upload → Table File Reader → Transform Data → Update Records → Generate Report

Common Automation Patterns

Batch Processing Workflows

Handle large datasets efficiently:
Large Excel File → Table File Reader → Repeat For Each (batch) → Transform Data → Update Records
Configuration:
  1. File Upload: Trigger on large file upload
  2. Table File Reader: Process in batches of 100 rows
  3. Repeat For Each: Iterate through batches
  4. Transform Data: Clean and validate each batch
  5. Update Records: Apply changes to database
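The "process in batches of 100 rows" step can be sketched with a small chunking helper (an illustration of the pattern, not the product's API):

```python
from itertools import islice

def batches(rows, size=100):
    """Group an iterable of rows into lists of at most `size` items."""
    it = iter(rows)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# 250 rows processed in batches of 100 yields groups of 100, 100, and 50.
sizes = [len(b) for b in batches(range(250), size=100)]
```

Because the helper consumes its input lazily, memory usage stays bounded even for files with thousands of rows.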

Best Practices

File Structure

Use consistent column names and data formats across files for reliable processing

Performance Optimization

Process large files in batches to avoid timeouts and memory issues

Data Validation

Implement validation rules to ensure data quality before processing

Error Handling

Plan for missing data, format errors, and processing failures

Spreadsheet Preparation Tips

For Optimal Results:
  • Use consistent column headers across files
  • Avoid merged cells in data areas
  • Keep data types consistent within columns
  • Use standard date formats (YYYY-MM-DD)
  • Remove formatting that might interfere with processing
Performance Considerations:
  • Break large files into smaller chunks when possible
  • Use specific cell ranges rather than entire sheets
  • Minimize complex formulas in source files
  • Consider CSV format for very large datasets
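When source files cannot be fixed in advance, inconsistent dates can be normalized during processing. A minimal sketch, assuming the listed input formats cover your files (extend `_FORMATS` as needed):

```python
from datetime import datetime

# Common spreadsheet date formats (an assumption; extend for your data).
_FORMATS = ("%Y-%m-%d", "%m/%d/%Y", "%d.%m.%Y", "%b %d, %Y")

def to_iso_date(value):
    """Normalize a date string to YYYY-MM-DD, or raise ValueError."""
    for fmt in _FORMATS:
        try:
            return datetime.strptime(value.strip(), fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    raise ValueError(f"unrecognized date format: {value!r}")
```

Note the ambiguity risk: "03/04/2024" parses differently under US and European conventions, so order the formats to match your data's origin.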

Advanced Features

Multi-Sheet Processing

Handle complex workbooks with multiple related sheets:
1. Sheet Configuration

   Define which sheets to process and their relationships.

2. Cross-Sheet References

   Use references like Sheet1!A1 and Sheet2!B2 to combine data.

3. Data Validation

   Verify data consistency across sheets.

4. Error Handling

   Handle missing sheets or invalid references.
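The cross-sheet steps above can be sketched as a small resolver. The workbook model (`{sheet: {cell: value}}`) is an assumption for illustration; a real reader would populate it from the parsed file:

```python
def resolve(workbook, ref, default_sheet="Sheet1"):
    """Resolve 'Sheet2!B2' or 'A1' against the workbook, with error handling
    for missing sheets and invalid references."""
    sheet, _, cell = ref.rpartition("!")
    sheet = sheet or default_sheet
    if sheet not in workbook:
        raise KeyError(f"missing sheet: {sheet}")
    if cell not in workbook[sheet]:
        raise KeyError(f"missing cell: {sheet}!{cell}")
    return workbook[sheet][cell]

wb = {"Sheet1": {"A1": "ACME Corp"}, "Sheet2": {"B2": 42}}
combined = {"customer": resolve(wb, "A1"), "total": resolve(wb, "Sheet2!B2")}
```

Raising on missing sheets (rather than returning a default) surfaces workbook-structure problems early, before partial data reaches downstream steps.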

Dynamic Column Detection

Automatically detect and process columns based on content:
File Upload → Table File Reader → AI Classification (column types) → Dynamic Field Mapping → Process Data
Benefits:
  • Handles varying file structures
  • Reduces manual configuration
  • Adapts to new data formats
  • Improves processing flexibility
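A simple rule-based version of column-type detection looks like this (a sketch only; the product's AI Classification step is not documented here, and the patterns are assumptions):

```python
import re

def detect_type(samples):
    """Guess a column's type from sample values: 'number', 'date', or 'text'."""
    if all(re.fullmatch(r"-?\d+(\.\d+)?", s) for s in samples):
        return "number"
    if all(re.fullmatch(r"\d{4}-\d{2}-\d{2}", s) for s in samples):
        return "date"
    return "text"

columns = {"Amount": ["19.99", "5"], "Purchase_Date": ["2024-03-01"], "Name": ["Ada"]}
types = {col: detect_type(vals) for col, vals in columns.items()}
```

Sampling a handful of values per column is usually enough to pick a type without scanning the whole file.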

Data Transformation Pipeline

Enhance extracted data with additional processing:
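One way to sketch such a pipeline is as a sequence of small row-transform functions. The step names here are illustrative, not part of the product API:

```python
def strip_whitespace(row):
    """Trim stray spaces from every string value."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}

def uppercase_status(row):
    """Normalize the status field to uppercase (hypothetical field name)."""
    return {**row, "status": row.get("status", "").upper()}

def pipeline(row, steps):
    """Apply each transform step to the row in order."""
    for step in steps:
        row = step(row)
    return row

row = pipeline({"name": "  Ada ", "status": "open"},
               [strip_whitespace, uppercase_status])
```

Keeping each step a pure function makes the pipeline easy to reorder and test in isolation.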

Error Handling and Troubleshooting

Validation Strategies

Implement validation checks to ensure data quality and processing reliability.
Validation Checklist:
  • Required fields are populated
  • Data types match expected formats
  • Numeric values are within valid ranges
  • Date formats are consistent
  • Text fields don’t exceed length limits
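The checklist above translates directly into a per-row validation function. The required fields, numeric range, and length limit are illustrative assumptions; substitute your own rules:

```python
def validate(row, required=("customer_name",), max_len=255):
    """Return a list of problems for one row, per the checklist above."""
    errors = []
    for field in required:                      # required fields populated
        if not row.get(field):
            errors.append(f"missing required field: {field}")
    amount = row.get("amount")                  # numeric range (assumed bounds)
    if amount is not None and not (0 <= amount <= 1_000_000):
        errors.append(f"amount out of range: {amount}")
    for key, value in row.items():              # text length limits
        if isinstance(value, str) and len(value) > max_len:
            errors.append(f"{key} exceeds {max_len} characters")
    return errors
```

Collecting all errors per row (rather than failing on the first) lets you report every problem in a file in one pass.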

Performance Optimization

Processing Speed

Optimization Techniques:
  • Use specific cell ranges instead of entire sheets
  • Process in batches for large datasets
  • Choose appropriate field types
  • Minimize complex transformations
  • Use CSV format for maximum speed

Memory Management

Memory Optimization:
  • Process files in batches
  • Clear variables after processing
  • Limit concurrent processing
  • Monitor system resources
  • Use streaming for very large files

Integration Examples

Customer Data Import

CSV Upload → Table File Reader → Transform Data → Search Records → Update Customer Records
Configuration:
  1. Attachment Added: Trigger on CSV upload
  2. Table File Reader: Map columns to customer fields
  3. Transform Data: Clean phone numbers, standardize addresses
  4. Search Records: Find existing customers
  5. Update Record: Merge new data with existing records
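The "clean phone numbers" transform in step 3 might look like this. A sketch assuming US-style 10-digit numbers; international handling would need a fuller library:

```python
import re

def clean_phone(raw):
    """Normalize a US-style phone number to a digits-only +1... form."""
    digits = re.sub(r"\D", "", raw)   # drop spaces, dashes, parentheses
    if len(digits) == 10:             # assume US number without country code
        digits = "1" + digits
    return "+" + digits

normalized = clean_phone("(555) 123-4567")
```

Normalizing before the Search Records step matters: "(555) 123-4567" and "555.123.4567" should match the same customer.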

Financial Report Processing

Excel Report → Table File Reader → Run Calculation → Update Financial Records → Generate Summary
Configuration:
  1. Email Received: Monthly report attachment
  2. Table File Reader: Extract key metrics from specific cells
  3. Run Calculation: Compute growth rates and trends
  4. Update Record: Store financial data
  5. Generate Report: Create executive summary
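The growth-rate computation in step 3 reduces to a one-line formula; the revenue figures below are illustrative:

```python
def growth_rate(previous, current):
    """Period-over-period growth as a fraction, e.g. 0.10 for +10%."""
    if previous == 0:
        raise ValueError("previous period is zero; growth rate undefined")
    return (current - previous) / previous

# Revenue growing from 100,000 to 112,000 is 12% growth.
rate = growth_rate(100_000, 112_000)
```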

Multi-Sheet Workbook Analysis

Complex Workbook → Table File Reader → Process Multiple Sheets → Combine Data → Create Analysis
Configuration:
  1. File Upload: Multi-sheet workbook
  2. Table File Reader: Process each sheet with specific configuration
  3. Combine Data: Merge data from all sheets
  4. AI Classification: Categorize combined data
  5. Create Record: Generate comprehensive analysis

Comparison with Other File Readers

When to Use Table File Reader

Choose Table File Reader when:
  • Processing Excel or CSV files
  • Working with structured tabular data
  • Need to handle thousands of rows
  • Requiring multi-sheet processing
Consider alternatives when:
  • Processing unstructured text documents (Text File Reader)
  • Extracting data from purchase order documents (Purchase Orders Reader)

Performance Comparison

| Feature     | Table File Reader | Text File Reader | Purchase Orders Reader |
| ----------- | ----------------- | ---------------- | ---------------------- |
| Speed       | ⭐⭐⭐⭐⭐        | ⭐⭐⭐           | ⭐⭐⭐                 |
| Volume      | ⭐⭐⭐⭐⭐        | ⭐⭐⭐           | ⭐⭐                   |
| Flexibility | ⭐⭐⭐            | ⭐⭐⭐           | ⭐⭐⭐                 |
| Accuracy    | ⭐⭐⭐⭐⭐        | ⭐⭐⭐⭐         | ⭐⭐⭐⭐               |

Next Steps


The Table File Reader provides powerful capabilities for processing Excel and CSV files at scale. Use it for data imports, report processing, and any workflow requiring structured tabular data extraction.