FileMaker Looping Scripts: Building Resilient and Reliable Data Processing
Introduction
FileMaker scripts often fail at scale because they remain tied too closely to the user interface. You might build a routine to update inventory and test it successfully on a few dozen records. When a user runs that same process on five thousand records, the result is often a frozen screen or a spinning cursor.
The operating system stops receiving updates from the application and marks it as "Not Responding." Users naturally assume the software has crashed and force-quit FileMaker. This interrupts the transaction and leaves your database in an unknown state with only half the records updated.
Reliable data processing requires separating the logic from the presentation. Techniques that work fine for small batches often cause performance bottlenecks or crashes when the record count grows. This article outlines how to design looping scripts that function reliably regardless of the dataset size.
The Enemy Is the Interface
The single biggest bottleneck in FileMaker scripting is the User Interface (UI). When a script runs on a layout, FileMaker incurs a massive "context weight." For every record you touch, the engine has to check conditional formatting, evaluate visibility calculations, load portal data, and render graphics.
If you are looping through 10,000 records on a complex Dashboard layout, you are asking FileMaker to render that dashboard 10,000 times. Freeze Window helps, but it does not eliminate the overhead of evaluating the layout's objects and loading related data for every record.
To build resilient loops, you must separate your processing logic from your presentation layer.
Strategy 1: The Dedicated Utility Layout
The first step to professional looping is never to process data on a user-facing layout. Instead, create "Utility" layouts for your tables.
A Utility layout should have:
- No triggers: Ensure no script triggers fire on record load or commit.
- No fields (or minimal fields): Only place the fields strictly necessary for the operation.
- No styling: It does not need a theme.
- No portals: Related data should not be fetched during the loop.
When you run your loop on this stripped-down layout, FileMaker only fetches the raw data it needs. The performance difference is often an order of magnitude. A loop that takes 10 minutes on a Form View might take 30 seconds on a Utility layout.
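A minimal sketch of the setup, assuming a utility layout named "UTILITY_Invoice" (the window coordinates simply push it off-screen):

# Open an off-screen window so the user's layout never redraws
New Window [ Style: Document ; Name: "Processing" ; Top: -10000 ; Left: -10000 ]
# Land on the stripped-down layout before touching any records
Go to Layout [ "UTILITY_Invoice" (Invoices) ]
Freeze Window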
Strategy 2: The "Loop-Off-Edge" Pattern
Many developers start by looping through portals. This is risky. Portals are fragile interface objects. If another developer changes the sort order of the portal or adds a filter, your script might break or process the wrong records.
The standard for professional development is the Loop-Off-Edge pattern on a list view (or utility layout). This relies on the Go to Record/Request/Page [ Next ; Exit after last: On ] step.
However, isolating the specific records you want to process requires a specific technique. FileMaker Finds cannot natively search for a return-separated list of IDs in a single field. To solve this, we use the GTRR Multikey method.
Prerequisite Schema
To make this work, you need a relationship built for this specific purpose:
- Global Field: Create a global text field (e.g., Globals::g_Processing_IDs).
- Relationship: Link Globals::g_Processing_IDs to the primary key of your target table (e.g., Invoices::ID) via a dedicated table occurrence (e.g., Invoices_Utility_Relation).
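The power of this setup is the "multikey" behavior: a match field containing return-separated values relates to every record whose key equals any one line. A quick illustration with hypothetical IDs:

# With three return-separated IDs in the global field...
Set Field [ Globals::g_Processing_IDs ; List ( "INV-001" ; "INV-002" ; "INV-003" ) ]
# ...the relationship now matches all three invoice records at once.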
Here is the robust structure:
# ---------------------------------------------------
# Setup: Load the IDs into a Global Field
# ---------------------------------------------------
# Assume $idList contains a return-separated list of UUIDs to process
Set Field [ Globals::g_Processing_IDs ; $idList ]
# ---------------------------------------------------
# Isolate records using GTRR
# ---------------------------------------------------
# Suppress FileMaker's own error dialogs so the script can handle
# failures itself (covers both the GTRR below and the loop)
Set Error Capture [ On ]
# This step grabs all records matching the IDs in the global field
# and opens them immediately in a new Utility window.
Go to Related Record [
Show only related records: On ;
From table: "Invoices_Utility_Relation" ;
Using layout: "UTILITY_Invoice" ;
New Window: Style: Card ; Name: "Processing"
]
# CRITICAL: Check if GTRR failed (e.g., no records found)
If [ Get ( LastError ) ≠ 0 ]
Show Custom Dialog [ "Error" ; "No records could be isolated for processing." ]
Exit Script [ Text Result: 0 ]
End If
# ---------------------------------------------------
# The Loop
# ---------------------------------------------------
Go to Record/Request/Page [ First ]
Loop
# 1. Perform the logic
# Example: Update status and timestamp
Set Field [ Invoices::Status ; "Processed" ]
Set Field [ Invoices::Timestamp_Process ; Get ( CurrentTimestamp ) ]
# 2. Check for errors (Resilience)
# A record locked by another user surfaces here as error 301
If [ Get ( LastError ) ≠ 0 ]
# Log this error to a variable or table
Perform Script [ "Log Error" ; Parameter: Invoices::ID ]
End If
# 3. Move to next
Go to Record/Request/Page [ Next ; Exit after last: On ]
End Loop
# ---------------------------------------------------
# Cleanup
# ---------------------------------------------------
# Clear the global to free up memory
Set Field [ Globals::g_Processing_IDs ; "" ]
Close Window [ Name: "Processing" ; Current file ]
Strategy 3: Providing User Feedback (The "Heartbeat")
When processing thousands of records, a frozen window cuts both ways. If the script runs for more than a few seconds without a single screen update, the operating system concludes the application has hung.
To prevent this, you need a "heartbeat" mechanism. This updates the UI periodically to let the user (and the OS) know that the script is still alive.
We can use the Mod function to update the screen only every X records.
# Inside your Loop
Set Variable [ $count ; Value: $count + 1 ]
# Update UI every 50 records
If [ Mod ( $count ; 50 ) = 0 ]
Set Window Title [ Current Window ; New Title: "Processing: " & $count & " completed..." ]
# Force a quick redraw of the window if needed
Refresh Window [ Flush cached join results: Off ]
End If
Using Set Window Title is a lightweight way to provide feedback. If you are using a Card Window with a progress bar, you can update a Global Variable and refresh just the progress bar object.
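A minimal sketch of that technique, assuming an object named "progress.bar" on the card window and a $total variable captured from Get ( FoundCount ) before the loop:

# Inside the heartbeat block (runs once every 50 records)
Set Variable [ $$PROGRESS ; Value: Round ( $count / $total * 100 ; 0 ) ]
# Redraw only the named object instead of the whole window
Refresh Object [ Object Name: "progress.bar" ]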
Crucial Advice: Do not refresh the whole window or update the UI on every record. That defeats the purpose of your optimization.
Strategy 4: Offloading to the Server
As your datasets grow into the tens of thousands, the most resilient pattern is often moving the workload off the user's computer entirely.
FileMaker allows you to execute scripts directly on the host machine using the Perform Script on Server (PSoS) script step. This approach offers significant advantages over local looping.
First, it eliminates network latency. A local script must drag data across the network wire to the client, process it, and push the changes back. A server-side script executes where the data lives. This proximity alone often results in performance gains of 10x or more.
Second, the server environment allows you to utilize hardware more effectively. While a standard client session is single-threaded, a server can handle multiple simultaneous script sessions. For truly massive datasets, you can split your list of IDs into chunks and spawn multiple PSoS sessions at once. This parallel processing allows you to engage multiple cores on the server CPU to complete hours of work in minutes.
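A minimal client-side sketch, assuming a server script named "Process Invoices (Server)" that reads the ID list from Get ( ScriptParameter ):

# Hand the ID list to the server and give control back to the user
Perform Script on Server [ Wait for completion: Off ; "Process Invoices (Server)" ; Parameter: $idList ]
# The server script then runs the same GTRR + loop pattern shown above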
Managing Server Resources
Even when running on the server, you must respect memory limits. The Batch Processing concept discussed earlier remains vital.
If you attempt to modify 50,000 records in a single pass without saving, you risk exhausting the memory allocated to the scripting engine. The safest approach involves a hybrid strategy:
- Run on Server: Initiate the process via PSoS to keep the client UI responsive.
- Batch the Commits: Inside the server script, process records in chunks of 500 or 1,000.
- Explicit Saves: Use Commit Records/Requests after each chunk to flush the cache and release record locks.
By combining server-side execution with disciplined batching, you create a system that is fast for the user and healthy for the database engine.
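To make the batching concrete, here is a minimal sketch of the commit rhythm inside the server-side loop (the chunk size of 500 is an assumption; tune it for your schema):

Loop
Set Variable [ $count ; Value: $count + 1 ]
# ... per-record logic goes here ...
If [ Mod ( $count ; 500 ) = 0 ]
# Flush the cache and release record locks for this chunk
Commit Records/Requests [ With dialog: Off ]
End If
Go to Record/Request/Page [ Next ; Exit after last: On ]
End Loop
# Commit whatever remains in the final partial chunk
Commit Records/Requests [ With dialog: Off ]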
When to Use "Replace Field Contents"
You might ask, "Why not just use Replace Field Contents? It is faster."
You are right; it is faster. However, Replace Field Contents is a blunt instrument.
- It can only set a single field per pass; it cannot branch into different actions (e.g., "If status is X, do Y, else do Z").
- It cannot handle errors on a per-record basis. If record #500 is locked by another user, Replace Field Contents throws an error and often stops or skips without giving you a detailed log of what failed.
Use Replace Field Contents when:
- You are making a simple, identical change to all records.
- You have exclusive access to the system (e.g., nightly maintenance scripts).
Use Looping Scripts when:
- You have conditional logic (If/Else).
- You need to update related records (parent/child data).
- You need to generate an error log for specific failures.
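For the simple, unconditional case, the one-step alternative looks like this (the field name and value are illustrative):

# Apply one identical change to every record in the found set
Replace Field Contents [ With dialog: Off ; Invoices::Status ; "Archived" ]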
Summary Checklist
To make your loops resilient:
- Escape the Interface: Open a new window using a Utility layout.
- Isolate the Context: Use a fresh Find or the GTRR multikey technique to target exactly the records you need to process.
- Use Loop-Off-Edge: Iterate using Go to Record [Next].
- Add a Heartbeat: Update the UI every 50 or 100 records to prevent "Application Not Responding" errors.
- Clean Up: Always close your processing windows and return the user to their original state.
By moving your processing logic away from the user interface, you create applications that feel faster, scale better, and protect your data integrity.