Key Features
Multi-Format Support
Process Excel (.xlsx, .xls) and CSV files with automatic format detection
High Volume Processing
Handle thousands of rows efficiently with batch processing capabilities
Multi-Sheet Support
Extract data from multiple Excel sheets with cross-sheet cell references
Flexible Data Mapping
Use column names or cell references for field mapping and data extraction
Supported File Types
The Table File Reader can process various spreadsheet formats:
Excel Formats
- XLSX - Modern Excel format (Excel 2007+)
- XLS - Legacy Excel format (Excel 97-2003)
- Multi-sheet workbooks - Extract from specific sheets or all sheets
- Complex formulas - Processes calculated values
CSV Formats
- CSV - Comma-separated values
- TSV - Tab-separated values
- Custom delimiters - Configure custom separators
- UTF-8 encoding - Full Unicode support
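Delimiter and encoding handling can be sketched with Python's standard csv module; the sample data and the `read_delimited` helper below are illustrative, not part of the Table File Reader itself:

```python
import csv
import io

# Sample tab-separated content; in practice this would come from an uploaded file
# decoded as UTF-8.
raw = "name\temail\nAda\tada@example.com\nGrace\tgrace@example.com\n"

def read_delimited(text, delimiter=","):
    """Parse delimited text into a list of row dicts keyed by header names."""
    reader = csv.DictReader(io.StringIO(text), delimiter=delimiter)
    return list(reader)

rows = read_delimited(raw, delimiter="\t")
print(rows[0]["email"])  # ada@example.com
```

The same helper handles plain CSV by leaving the default delimiter, which is why TSV and custom separators are just a configuration detail rather than a different format.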
Data Structures
- Tabular data - Structured rows and columns
- Report formats - Header rows and summary data
- Template-based - Data in specific cell locations
- Mixed formats - Combination of structured and template data
Processing Modes
The Table File Reader supports two primary processing modes:
Bulk Data Processing
Process thousands of rows systematically using column names:
- Column-Based Processing
- Batch Processing
Use Case: Import large datasets where each row represents a record
Configuration:
- Header Row: Specify which row contains column names
- Data Rows: Define the range of rows to process
- Field Mapping: Map column names to field names
- Data Types: Configure field types (Text, Number, Date, etc.)
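The bulk configuration above can be sketched in Python; the `FIELD_MAP` structure and type converters are illustrative assumptions, not a built-in configuration format:

```python
import csv

# Hypothetical mapping: source column name -> (field name, type converter).
FIELD_MAP = {
    "Customer_Name": ("customer_name", str),
    "Amount": ("amount", float),
}
HEADER_ROW = 1  # 1-based row that contains the column names

raw = "Customer_Name,Amount\nAcme,99.50\nGlobex,12.00\n"

# Skip any rows above the header, then map each data row to typed fields.
lines = raw.splitlines()[HEADER_ROW - 1:]
records = [
    {field: cast(row[col]) for col, (field, cast) in FIELD_MAP.items()}
    for row in csv.DictReader(lines)
]
print(records[0])  # {'customer_name': 'Acme', 'amount': 99.5}
```

Keeping the mapping declarative like this is what lets the same reader configuration apply to every row in a large file.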
Single Row Processing
Extract specific data using cell references across multiple sheets:
- Cell Reference Processing
- Cross-Sheet References
Use Case: Extract specific values from template-based spreadsheets
Configuration:
- Cell References: Use Excel notation (A1, B2, Sheet1!C3)
- Sheet Selection: Specify which sheets to process
- Field Names: Define custom field names for each cell
- Data Types: Set appropriate types for each field
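Excel-style references such as A1, B2, and Sheet1!C3 decompose into a sheet name, a row, and a column. A minimal sketch of that parsing (helper names are illustrative) looks like:

```python
import re

# Matches an optional "Sheet!" prefix, column letters, and a 1-based row number.
REF_PATTERN = re.compile(r"^(?:(?P<sheet>[^!]+)!)?(?P<col>[A-Z]+)(?P<row>\d+)$")

def parse_ref(ref):
    """Split an Excel-style reference into (sheet, row, column) parts."""
    m = REF_PATTERN.match(ref)
    if not m:
        raise ValueError(f"Invalid cell reference: {ref}")
    # Convert column letters to a 1-based index (A=1, B=2, ..., AA=27).
    col = 0
    for ch in m.group("col"):
        col = col * 26 + (ord(ch) - ord("A") + 1)
    return m.group("sheet"), int(m.group("row")), col

print(parse_ref("Sheet1!C3"))  # ('Sheet1', 3, 3)
print(parse_ref("B2"))         # (None, 2, 2)
```

A reference without a sheet prefix (sheet part `None`) falls back to whichever sheet is selected in the reader configuration.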
Creating a Table File Reader
1. Navigate to File Readers
In your application, go to the File Readers section
2. Create New Reader
Click + File Reader and select Table Data from the document type options
3. Configure Basic Settings
- Name: Enter a descriptive name (e.g., “Customer Data Import”)
- Description: Optional description for your team
- Processing Mode: Choose between bulk processing or single-row processing
4. Set Up Field Mapping
- For Bulk Processing: Map column names to field names
- For Single Row: Define cell references for each field
- Configure field types and validation rules
5. Test with Sample File
Upload a sample Excel or CSV file to validate data extraction
Field Configuration
Column-Based Field Mapping
For bulk data processing using column names:
- Basic Column Mapping
- Advanced Column Processing
Configuration:
- Column Name: “Customer_Name” → Field Name: “customer_name”
- Column Name: “Email_Address” → Field Name: “email”
- Column Name: “Purchase_Date” → Field Name: “purchase_date” (Date)
- Column Name: “Amount” → Field Name: “amount” (Decimal)
Cell Reference Configuration
For single-row processing using specific cell locations:
- Single Sheet References
- Multi-Sheet References
Configuration:
- Customer Name: A2
- Order Date: B2
- Total Amount: C2
- Status: D2
Using in Automations
Integration with Data Workflows
The Table File Reader integrates with automation workflows for comprehensive data processing:
Common Automation Patterns
Bulk Data Import
Trigger: Attachment Added (Excel file)
File Reader: Process all rows using column mapping
Actions:
- Repeat For Each row
- Transform Data to clean values
- Search Records to find existing entries
- Create Record or Update Record based on search results
- Generate Report with import summary
Report Processing
Trigger: Email Received (with report attachment)
File Reader: Extract key metrics using cell references
Actions:
- Run Calculation to compute derived values
- Update Record Fields with report data
- AI Classification to categorize report type
- Send Email Notification with summary
Template-Based Processing
Trigger: Record Created (form submission)
File Reader: Extract data from uploaded template
Actions:
- Transform Data to standardize formats
- AI Classification to validate data quality
- Create Record with extracted information
- Start Approval Process if required
Batch Processing Workflows
Handle large datasets efficiently:
- File Upload: Trigger on large file upload
- Table File Reader: Process in batches of 100 rows
- Repeat For Each: Iterate through batches
- Transform Data: Clean and validate each batch
- Update Records: Apply changes to database
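The batching step above can be sketched as a small generator that splits any row stream into fixed-size chunks; `batched` is an illustrative helper, not a platform API:

```python
from itertools import islice

def batched(rows, size=100):
    """Yield successive lists of at most `size` rows from any iterable."""
    it = iter(rows)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch

# Simulate a large dataset of row dicts (a stand-in for parsed file rows).
rows = ({"id": i} for i in range(250))

batch_sizes = [len(b) for b in batched(rows, size=100)]
print(batch_sizes)  # [100, 100, 50]
```

Because the generator never materializes the whole file, memory use stays bounded by the batch size rather than the row count.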
Best Practices
File Structure
Use consistent column names and data formats across files for reliable processing
Performance Optimization
Process large files in batches to avoid timeouts and memory issues
Data Validation
Implement validation rules to ensure data quality before processing
Error Handling
Plan for missing data, format errors, and processing failures
Spreadsheet Preparation Tips
For Optimal Results:
- Use consistent column headers across files
- Avoid merged cells in data areas
- Keep data types consistent within columns
- Use standard date formats (YYYY-MM-DD)
- Remove formatting that might interfere with processing
- Break large files into smaller chunks when possible
- Use specific cell ranges rather than entire sheets
- Minimize complex formulas in source files
- Consider CSV format for very large datasets
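When source files cannot be cleaned up in advance, date normalization can happen at processing time instead. A sketch, assuming a few common input formats (the list is illustrative):

```python
from datetime import datetime

# Hypothetical set of input formats to try, in order, before giving up.
INPUT_FORMATS = ("%Y-%m-%d", "%m/%d/%Y", "%d.%m.%Y")

def normalize_date(value):
    """Normalize an assorted date string to the standard YYYY-MM-DD form."""
    for fmt in INPUT_FORMATS:
        try:
            return datetime.strptime(value, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    raise ValueError(f"Unrecognized date format: {value}")

print(normalize_date("03/14/2024"))  # 2024-03-14
```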
Advanced Features
Multi-Sheet Processing
Handle complex workbooks with multiple related sheets:
1. Sheet Configuration
Define which sheets to process and their relationships
2. Cross-Sheet References
Use references like Sheet1!A1, Sheet2!B2 to combine data
3. Data Validation
Verify data consistency across sheets
4. Error Handling
Handle missing sheets or invalid references
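The combination and error-handling steps above can be sketched together. Here the workbook is modeled as nested dicts (sheet name to cell address to value) purely for illustration:

```python
# Illustrative in-memory stand-in for a parsed multi-sheet workbook.
workbook = {
    "Sheet1": {"A1": "ACME-001"},
    "Sheet2": {"B2": 1250.0},
}

def get_cell(workbook, ref, default=None):
    """Resolve a 'Sheet!Cell' reference, returning `default` when missing."""
    sheet_name, _, cell = ref.partition("!")
    sheet = workbook.get(sheet_name)
    if sheet is None:
        return default  # missing sheet handled gracefully
    return sheet.get(cell, default)

combined = {
    "order_id": get_cell(workbook, "Sheet1!A1"),
    "total": get_cell(workbook, "Sheet2!B2"),
    "notes": get_cell(workbook, "Sheet3!C1", default=""),
}
print(combined)  # {'order_id': 'ACME-001', 'total': 1250.0, 'notes': ''}
```

Passing an explicit default for each reference is one way to keep a renamed or missing sheet from failing the whole extraction.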
Dynamic Column Detection
Automatically detect and process columns based on content:
- Handles varying file structures
- Reduces manual configuration
- Adapts to new data formats
- Improves processing flexibility
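Content-based detection typically samples a column's values and guesses a type. A minimal sketch, assuming three type buckets (the heuristic and names are illustrative):

```python
import re

# A simple ISO-date pattern used as the detection heuristic.
DATE_RE = re.compile(r"^\d{4}-\d{2}-\d{2}$")

def infer_type(values):
    """Guess a column type ('number', 'date', or 'text') from sample values."""
    non_empty = [v for v in values if v.strip()]
    if not non_empty:
        return "text"
    if all(DATE_RE.match(v) for v in non_empty):
        return "date"
    try:
        for v in non_empty:
            float(v)
        return "number"
    except ValueError:
        return "text"

print(infer_type(["10", "2.5", ""]))             # number
print(infer_type(["2024-01-01", "2024-02-01"]))  # date
```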
Data Transformation Pipeline
Enhance extracted data with additional processing:
Data Cleaning
Transform Data: Clean and standardize extracted values
AI Classification: Detect and categorize data types
Run Calculation: Compute derived values
Set Variable: Store processed data for later use
Data Validation
IF Conditions: Validate data meets requirements
Search Records: Check for duplicates or existing entries
AI Classification: Assess data quality and completeness
Post Comment: Log validation results
Error Handling and Troubleshooting
Common Issues
Column Mapping Errors
Symptoms: Fields return empty values or incorrect data
Causes:
- Column names don’t match configuration
- Header row in wrong location
- Data types incompatible
Solutions:
- Verify column names in source file
- Check header row configuration
- Adjust field types to match data
- Use Transform Data to clean values
Cell Reference Issues
Symptoms: Cell references return errors or empty values
Causes:
- Sheet names changed
- Cell locations moved
- Referenced cells are empty
Solutions:
- Verify sheet names and structure
- Update cell references
- Add IF conditions to handle empty cells
- Use named ranges for stability
Large File Processing
Symptoms: Processing timeouts or memory errors
Causes:
- File too large for single processing
- Complex formulas slow processing
- Memory limitations
Solutions:
- Enable batch processing
- Split large files into smaller chunks
- Use CSV format for very large datasets
- Process during off-peak hours
Validation Strategies
Implement validation checks to ensure data quality and processing reliability. Verify that:
- Required fields are populated
- Data types match expected formats
- Numeric values are within valid ranges
- Date formats are consistent
- Text fields don’t exceed length limits
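The checks above can be collected into a single per-row validator. A sketch, assuming hypothetical field names (`customer_name`, `amount`, `notes`) and an illustrative range limit:

```python
def validate_row(row):
    """Return a list of validation errors for one extracted row."""
    errors = []
    # Required field is populated.
    if not row.get("customer_name", "").strip():
        errors.append("customer_name is required")
    # Numeric value present and within a valid range.
    amount = row.get("amount")
    if not isinstance(amount, (int, float)):
        errors.append("amount must be numeric")
    elif not (0 <= amount <= 1_000_000):
        errors.append("amount out of range")
    # Text field does not exceed its length limit.
    if len(row.get("notes", "")) > 500:
        errors.append("notes exceeds length limit")
    return errors

print(validate_row({"customer_name": "Acme", "amount": 99.5}))  # []
```

Returning a list of errors rather than failing on the first one makes it easy to log every problem with a row in a single pass.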
Performance Optimization
Processing Speed
Optimization Techniques:
- Use specific cell ranges instead of entire sheets
- Process in batches for large datasets
- Choose appropriate field types
- Minimize complex transformations
- Use CSV format for maximum speed
Memory Management
Memory Optimization:
- Process files in batches
- Clear variables after processing
- Limit concurrent processing
- Monitor system resources
- Use streaming for very large files
Integration Examples
Customer Data Import
- Attachment Added: Trigger on CSV upload
- Table File Reader: Map columns to customer fields
- Transform Data: Clean phone numbers, standardize addresses
- Search Records: Find existing customers
- Update Record: Merge new data with existing records
Financial Report Processing
- Email Received: Monthly report attachment
- Table File Reader: Extract key metrics from specific cells
- Run Calculation: Compute growth rates and trends
- Update Record: Store financial data
- Generate Report: Create executive summary
Multi-Sheet Workbook Analysis
- File Upload: Multi-sheet workbook
- Table File Reader: Process each sheet with specific configuration
- Combine Data: Merge data from all sheets
- AI Classification: Categorize combined data
- Create Record: Generate comprehensive analysis
Comparison with Other File Readers
When to Use Table File Reader
Choose Table File Reader when:
- Processing Excel or CSV files
- Working with structured tabular data
- Need to handle thousands of rows
- Requiring multi-sheet processing
Choose a different file reader when:
- Processing unstructured documents (Text File Reader)
- Working with business forms (Purchase Orders Reader)
- Requiring AI-powered analysis (Elementum Intelligence Reader)
- Processing JSON data (JSON File Reader)
Performance Comparison
| Feature | Table File Reader | Text File Reader | Purchase Orders Reader |
| --- | --- | --- | --- |
| Speed | ⭐⭐⭐⭐⭐ | ⭐⭐ | ⭐⭐⭐⭐ |
| Volume | ⭐⭐⭐⭐⭐ | ⭐⭐ | ⭐⭐⭐ |
| Flexibility | ⭐⭐⭐⭐ | ⭐⭐ | ⭐⭐⭐ |
| Accuracy | ⭐⭐⭐⭐⭐ | ⭐⭐⭐ | ⭐⭐⭐⭐⭐ |
Next Steps
Automation System
Learn how to integrate Table File Readers with data processing workflows
Data Mining
Explore advanced data processing and analysis capabilities
Calculations
Perform calculations on extracted spreadsheet data
Analytics
Create analytics and reports from processed table data
The Table File Reader provides powerful capabilities for processing Excel and CSV files at scale. Use it for data imports, report processing, and any workflow requiring structured tabular data extraction.