Data Table Trigger

The Data Table Trigger fires a workflow whenever a row is added to, updated in, or removed from a given Data Table.

Node Properties

There are two configurable properties for the Data Table Trigger:

Data Table

First, select the Data Table to watch. The trigger will fire when one or more of the selected actions is applied to that table.

Trigger Actions

Next, choose one or more actions applied to the Data Table that should fire the trigger. At least one action is required. You may fire the trigger on the following actions:

Trigger on a single row insertion

The trigger will fire whenever a single row is added to the Data Table in any of the following ways:

  • By clicking the "Add Row" button in the table's interface.
  • By importing a CSV with a single row in the table's interface.
  • By invoking a Table: Insert Rows Node with just a single new row:
    • Through the "Individual Fields" option.
    • Through the "Payload Path" or "JSON Template" option, providing either an object for the new row or an array containing a single new row object.
  • Through the Data Table Rows: Post API endpoint with a single new row (either as an object or as an array with a single entry), as in the sketch below.
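
As a rough illustration of the last item above, the following Python sketch posts a single new row through the Data Table Rows: Post endpoint. The base URL, path, header, and the IDs and token shown are assumptions for illustration only; confirm the exact request format against the Losant API documentation.

import requests

# Hypothetical IDs and token -- replace with values from your own application.
APP_ID = "555555555555eeeeeeeeeeee"
TABLE_ID = "<data table ID>"
API_TOKEN = "<Losant API token>"

# Data Table Rows: Post with a single row object. This fires the Data Table
# Trigger once with action "insert". (Path assumed from Losant REST conventions.)
resp = requests.post(
    f"https://api.losant.com/applications/{APP_ID}/data-tables/{TABLE_ID}/rows",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={"name": "Losant", "type": "IoT Platform"},
)
resp.raise_for_status()
print(resp.json())  # the newly created row, including its id, createdAt, and updatedAt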

Trigger on a bulk row insertion

The trigger will fire whenever multiple rows are added to the Data Table in a single operation, such as by importing a CSV with multiple rows, invoking a Table: Insert Rows Node with multiple new rows, or calling the Data Table Rows: Post API endpoint with an array containing more than one new row object (see the sketch below).

Note: When importing a CSV, the Losant interface may break the contents of larger files up into multiple requests. This will result in the Data Table Trigger firing multiple times for a single file import.
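
For comparison with the single-row case, a bulk insertion through the same assumed Data Table Rows: Post endpoint would send an array of row objects instead; the trigger then fires once for the whole operation with action bulkInsert. A minimal sketch, using the same hypothetical IDs and token:

import requests

APP_ID = "555555555555eeeeeeeeeeee"   # hypothetical application ID
TABLE_ID = "<data table ID>"          # hypothetical data table ID
API_TOKEN = "<Losant API token>"

# Posting an array of rows is a bulk insertion, so the Data Table Trigger
# fires once with action "bulkInsert" rather than once per row.
new_rows = [
    {"name": "Sensor A", "type": "Temperature"},
    {"name": "Sensor B", "type": "Humidity"},
]
resp = requests.post(
    f"https://api.losant.com/applications/{APP_ID}/data-tables/{TABLE_ID}/rows",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json=new_rows,
)
resp.raise_for_status()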

Trigger on a single row update

The trigger will fire whenever one or more values in a Data Table row are updated, whether the change is made through the table's interface, a workflow node, or the Losant API.

Note: The trigger will not fire if the updates applied to the row do not result in any value changes (for example, setting one column's value to "hello" when the value in the row is already "hello").
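
For illustration, a single-row update made through Losant's REST API might look like the sketch below. The path, method, and body shape for the row update request are assumptions here (as are the IDs and token), so verify them against the Losant API documentation.

import requests

APP_ID = "555555555555eeeeeeeeeeee"   # hypothetical IDs and token for illustration
TABLE_ID = "<data table ID>"
ROW_ID = "<row ID>"
API_TOKEN = "<Losant API token>"

# Changing a column value fires the trigger once with action "update".
# Per the note above, re-sending a value identical to what the row already
# contains would not fire the trigger.
resp = requests.patch(
    f"https://api.losant.com/applications/{APP_ID}/data-tables/{TABLE_ID}/rows/{ROW_ID}",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={"type": "A GREAT IoT Platform"},
)
resp.raise_for_status()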

Trigger on a single row deletion

The trigger will fire a single time whenever a single Data Table row is deleted.

The trigger will fire multiple times (once per row) in the following scenarios:

  • By providing a query that does not automatically match all rows and then selecting the "Delete Rows" option through the table's interface.
  • By invoking a Table: Delete Rows Node with a "Query Template" that matches multiple rows and a "Limit Template" greater than 1 in the configuration.
  • Through the Data Table Rows: Delete API endpoint with a query that resolves to multiple rows and a "limit" greater than 1 in the request (see the sketch below).

Note: The trigger will fire a maximum of 10,000 times in the scenarios described above.
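
As a hedged sketch of the API scenario above: the request below assumes the Data Table Rows: Delete endpoint accepts a query and a limit in a POST body at the path shown, along with a simple equality query format. None of those details are confirmed here, so check the Losant API documentation before relying on them. With a limit greater than 1, the trigger fires once per deleted row, up to the 10,000-firing cap.

import requests

APP_ID = "555555555555eeeeeeeeeeee"   # hypothetical IDs and token for illustration
TABLE_ID = "<data table ID>"
API_TOKEN = "<Losant API token>"

# Assumed request shape: delete rows whose "status" column equals "Inactive",
# up to 500 of them. Each deleted row fires the Data Table Trigger once.
resp = requests.post(
    f"https://api.losant.com/applications/{APP_ID}/data-tables/{TABLE_ID}/rows/delete",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={"query": {"status": "Inactive"}, "limit": 500},
)
resp.raise_for_status()
print(resp.json())  # response describing the result of the deletion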

The trigger will not fire in the following scenarios:

  • By providing a query that automatically matches all rows and then selecting the "Delete Rows" option through the table's interface. (The confirmation modal includes a notice that Data Table Triggers will not fire for these types of queries.)
  • Through the Data Table Rows: Truncate API endpoint.

Payload

Depending on the action that caused the trigger to fire, the shape of the workflow's initial payload changes significantly.

Single Row Insertion

When inserting a single row, the payload includes the details about the new row under the data property:

  • action: The action that caused the trigger to fire, which will be insert.
  • newRow: An object containing the new row's column values, including:
    • id: The ID of the newly created row.
    • createdAt: A Date object for when the row was created.
    • updatedAt: A Date object for when the row was last updated, which will match the createdAt value for a newly created row.
    • Other keys and values for each column in the new row.

For example, given a new row with the following column values ...

{
  "name": "Losant",
  "type": "IoT Platform"
}

... a Data Table Trigger workflow payload will look like the following:

{
  "applicationId": "555555555555eeeeeeeeeeee",
  "applicationName": "My Great Application",
  "data": {
    "action": "insert",
    "newRow": {
      "id": "<ID of the new row>",
      "createdAt": <Date the row was created>,
      "updatedAt": <Date the row was last updated>,
      "name": "Losant",
      "type": "IoT Platform"
    }
  },
  "flowId": "333333333333cccccccccccc",
  "flowName": "My Great Workflow",
  "flowVersion": "myFlowVersion",
  "globals": { ... },
  "relayId": "<ID of the entity that caused the workflow to fire>",
  "relayType": "<type of entity that caused the workflow to fire>",
  "time": Tue Aug 19 2025 15:16:17 GMT+0000 (Coordinated Universal Time),
  "triggerId": "<data table ID>",
  "triggerType": "dataTable"
}

Bulk Row Insertion

When inserting multiple rows, the new rows themselves are not available on the payload; however, the IDs of each added row are included, along with some statistics about the operation, under the data property:

  • action: The action that caused the trigger to fire, which will be bulkInsert.
  • count: The number of rows that were added.
  • errorCount: The number of rows that failed to add as part of the operation.
  • rowIds: An array of IDs for each added row.

For example, given a bulk insertion in which three rows were added and one row failed to add, a Data Table Trigger workflow payload will look like the following:

{
  "applicationId": "555555555555eeeeeeeeeeee",
  "applicationName": "My Great Application",
  "data": {
    "action": "bulkInsert",
    "count": 3,
    "errorCount": 1,
    "rowIds": [
      "<ID of the first added row>",
      "<ID of the second added row>",
      "<ID of the third added row>"
    ]
  },
  "flowId": "333333333333cccccccccccc",
  "flowName": "My Great Workflow",
  "flowVersion": "myFlowVersion",
  "globals": { ... },
  "relayId": "<ID of the entity that caused the workflow to fire>",
  "relayType": "<type of entity that caused the workflow to fire>",
  "time": Tue Aug 19 2025 15:16:17 GMT+0000 (Coordinated Universal Time),
  "triggerId": "<data table ID>",
  "triggerType": "dataTable"
}

Row Update

When updating a row, the row's old column values and new column values are both included on the payload under the data property:

  • action: The action that caused the trigger to fire, which will be update.
  • newRow: An object containing the new row's column values, including:
    • id: The ID of the updated row.
    • createdAt: A Date object for when the row was created.
    • updatedAt: A Date object for when the row was last updated, which will match the current time of the update.
    • Other keys and new values for each column in the row.
  • oldRow: An object containing the previous row's column values, including:
    • id: The ID of the updated row.
    • createdAt: A Date object for when the row was created.
    • updatedAt: A Date object for when the row was last updated, which will match the previous time of the update.
    • Other keys and old values for each column in the row.

For example, given a row with the following column values before the update ...

{
  "name": "Losant",
  "type": "A Good IoT Platform"
}

... and then updated to the following column values ...

{
  "name": "Losant",
  "type": "A GREAT IoT Platform",
  "status": "Active"
}

... a Data Table Trigger workflow payload will look like the following:

{
  "applicationId": "555555555555eeeeeeeeeeee",
  "applicationName": "My Great Application",
  "data": {
    "action": "update",
    "newRow": {
      "id": "<ID of the updated row>",
      "createdAt": <Date the row was created>,
      "updatedAt": <Date of this update>,
      "name": "Losant",
      "type": "A GREAT IoT Platform",
      "status": "Active"
    },
    "oldRow": {
      "id": "<ID of the updated row>",
      "createdAt": <Date the row was created>,
      "updatedAt": <Date of the previous update>,
      "name": "Losant",
      "type": "A Good IoT Platform"
    }
  },
  "flowId": "333333333333cccccccccccc",
  "flowName": "My Great Workflow",
  "flowVersion": "myFlowVersion",
  "globals": { ... },
  "relayId": "<ID of the entity that caused the workflow to fire>",
  "relayType": "<type of entity that caused the workflow to fire>",
  "time": Tue Aug 19 2025 15:16:17 GMT+0000 (Coordinated Universal Time),
  "triggerId": "<data table ID>",
  "triggerType": "dataTable"
}

Row Deletion

When deleting a row, the previous row values are included in the initial payload under the data property:

  • action: The action that caused the trigger to fire, which will be delete.
  • oldRow: An object containing the previous row's column values, including:
    • id: The ID of the deleted row.
    • createdAt: A Date object for when the row was created.
    • updatedAt: A Date object for when the row was last updated.
    • Other keys and values for each column in the deleted row.

For example, given a row with the following column values before the deletion ...

{
  "name": "Losant",
  "type": "A Good IoT Platform"
}

... a Data Table Trigger workflow payload will look like the following:

{
  "applicationId": "555555555555eeeeeeeeeeee",
  "applicationName": "My Great Application",
  "data": {
    "action": "delete",
    "oldRow": {
      "id": "<ID of the deleted row>",
      "createdAt": <Date the row was created>,
      "updatedAt": <Date the row was last updated>,
      "name": "Losant",
      "type": "A Good IoT Platform"
    }
  },
  "flowId": "333333333333cccccccccccc",
  "flowName": "My Great Workflow",
  "flowVersion": "myFlowVersion",
  "globals": { ... },
  "relayId": "<ID of the entity that caused the workflow to fire>",
  "relayType": "<type of entity that caused the workflow to fire>",
  "time": Tue Aug 19 2025 15:16:17 GMT+0000 (Coordinated Universal Time),
  "triggerId": "<data table ID>",
  "triggerType": "dataTable"
}
