This is the full developer documentation for IdentityFlow # Event-Sourced Architecture > Understand how IdentityFlow uses event sourcing for reliability and auditability. Event-Sourced Architecture Every state change is recorded as an immutable event, enabling replay and audit. Every state change in your workflow is recorded as an immutable event: ```typescript export default defineWorkflow('document approval', async (flow) => { // Each action is recorded as an event await flow.do('submit document', async () => { await submitDocument(flow.params); // Event: DocumentSubmitted }); // State changes are derived from events await flow.do('request approval', async () => { await requestApproval(flow.params); // Event: ApprovalRequested }); // Events can be replayed to reconstruct state await flow.do('process approval', async () => { await processApproval(flow.params); // Event: ApprovalProcessed }); }); ``` **Key Benefits:** * **Full Auditability**: A complete, unchangeable history of every action and state transition. * **State Reconstruction**: Replay events to understand the state of a workflow at any point in time. * **Debugging and Diagnostics**: Easily trace what happened leading up to an issue. * **Temporal Queries**: Analyze process evolution over time. This foundational pattern ensures that IdentityFlow workflows are transparent, traceable, and resilient. [Learn more about advanced concepts in our Deep Dive section.](../../deep-dive/) # Deterministic Execution > Learn how IdentityFlow ensures workflows behave predictably every time. Deterministic Execution Given the same input, workflows always follow the same execution path. IdentityFlow is designed to ensure that workflow execution is deterministic. This means that for a given workflow definition and a specific set of input parameters, the workflow will always: * Execute the same steps in the same order. * Produce the same intermediate state changes. * Arrive at the same final outcome. **Why is Determinism Important?** * **Predictability**: You can rely on your workflows to behave consistently, making them easier to understand, test, and manage. * **Testability**: Deterministic workflows are highly testable. You can write tests that verify specific execution paths and outcomes with confidence. * **Reliability**: It eliminates a class of bugs related to inconsistent behavior or race conditions that can plague non-deterministic systems. * **Replayability**: In conjunction with event sourcing, determinism allows for accurate replay of workflow history for auditing or debugging. This core principle contributes significantly to the robustness and trustworthiness of IdentityFlow. [Explore our Quick Start Guide to build your first workflow.](../../guides/20-quickstart/) # Simple, Intuitive API > Discover how IdentityFlow's API makes workflow definition straightforward. IdentityFlow provides a clean and developer-friendly API for defining and managing workflows using familiar TypeScript/JavaScript patterns. 
```typescript import { defineWorkflow } from '@identity-flow/sdk'; // This binding helper facilitates creating typed GraphQL clients // See @identity-flow/binding-graphql and gql.tada documentation import { createGraphqlClient } from '@identity-flow/sdk/bindings/graphql'; // Define a stable binding function for lazy injection const GraphqlClient = () => createGraphqlClient(); export default defineWorkflow( { name: 'user-approval', // Workflow name is set in the config object }, async (flow) => { // Option 1: Lazy-load and cache the GraphQL client const client = flow.use(GraphqlClient); await flow.do('fetch-data-1', async () => { const data = await client.request('{ user { id } }'); // Example query }); // Option 2: Directly use the GraphQL client via task context await flow.do('fetch-data-2', async ({ use }) => { const data = await use(GraphqlClient).request('{ user { name } }'); // Example query }); }, ); ``` **Key API Design Principles:** * **Code-First**: Define workflows directly in your codebase using TypeScript or JavaScript. * **Declarative Style**: Clearly express workflow logic with helper functions like `flow.do()`, `flow.sleep()`, `flow.use()`. * **Lazy Loading**: Efficiently manage external service integrations (like GraphQL clients) with `flow.use()`. * **Async/Await**: Leverage modern JavaScript features for clean asynchronous code. This approach empowers developers to build complex automations without a steep learning curve, integrating seamlessly with existing development practices. [See our Workflow Rules for best practices.](../../deep-dive/05-workflow-rules/) # Type-Safe Development > Leverage TypeScript for robust and maintainable workflow definitions. IdentityFlow is designed with TypeScript at its core, enabling you to build robust, maintainable, and type-safe workflows. ```typescript import * as v from '@identity-flow/sdk/valibot'; import { defineWorkflow } from '@identity-flow/sdk'; // Example using built-in Valibot // Define types for parameters and results interface OrderParams { orderId: string; items: Array<{ id: string; quantity: number }>; } interface ProcessResult { status: 'COMPLETED' | 'PENDING' | 'FAILED'; reference: string; } // Use generics on the defineWorkflow function itself export default defineWorkflow<OrderParams, ProcessResult>( { name: 'process-order', // Optionally define input schema for automatic validation schema: v.object({ orderId: v.string(), items: v.array(v.object({ id: v.string(), quantity: v.number() })), }), }, async (flow) => { // Full type inference for flow.params based on schema or generic const { orderId, items } = flow.params; flow.log(`Processing order ${orderId}`); // Return type is enforced by the generic or schema const result: ProcessResult = await flow.do('process-order-step', async () => { const apiResult = await processOrder({ orderId, items }); // Call external logic // Map the result to the expected ProcessResult shape return { status: apiResult.status === 'success' ? 'COMPLETED' : 'FAILED', reference: apiResult.id, }; }); return result; }, ); ``` **Advantages of Type Safety:** * **Early Error Detection**: Catch type-related errors at compile time, long before they reach production. * **Improved Code Quality**: Encourages clearer, more explicit code and data structures. * **Enhanced Maintainability**: Makes refactoring and understanding existing workflows easier and safer. * **Better Developer Experience**: Autocompletion and type inference in your IDE improve productivity.
* **Schema Validation**: Integrate with validation libraries like Valibot (shown), Zod, or Yup for runtime data validation based on your types. By embracing TypeScript, IdentityFlow helps you build more reliable and scalable automation solutions. [Read our guide on TypeScript Support for more details.](../../deep-dive/25-typescript-support/) # Resilient Asynchronous Flow > Learn how IdentityFlow handles asynchronous operations and errors robustly. IdentityFlow is built to manage long-running, asynchronous processes with resilience. It ensures that workflows can pause, wait for external events, handle errors, and resume reliably. * Basic Flow ```typescript export default defineWorkflow( { name: 'order-processing' }, async (flow) => { // Each step is tracked and can be retried await flow.do('validate order', async () => { await validateOrder(flow.params); }); // Run steps in parallel using Promise.all // Note: The workflow waits until all promises in Promise.all resolve. // Handling errors with Promise.allSettled or cancellation with Promise.race // requires careful consideration within the IdentityFlow model. await Promise.all([ flow.do('process payment', async () => { await processPayment(flow.params.payment); }), flow.do('check inventory', async () => { await checkInventory(flow.params.items); }), ]); // Handle conditional flows if (flow.params.requiresShipping) { await flow.do('arrange shipping', async () => { await arrangeShipping(flow.params); }); } }); ``` * Error Handling ```typescript export default defineWorkflow( { name: 'payment-processing', // Default retries for all steps in this workflow defaults: { retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' } } }, async (flow) => { try { await flow.do('process payment', async () => { await processPayment(flow.params); }); } catch (error) { flow.error('Payment failed', { error: error.message, code: error.code }); // Re-throwing fails the workflow after retries are exhausted throw error; } }); ``` **Core Resilience Features:** * **Stateful Execution**: Workflows maintain their state across asynchronous operations, ensuring they resume exactly where they left off. * **Built-in Retries**: Configure automatic retries for failed steps with customizable delays and backoff strategies. * **Explicit Error Handling**: Use standard `try...catch` blocks and `flow.error()` for robust error management and logging. * **Parallel Processing**: Safely execute multiple asynchronous tasks in parallel using `Promise.all`. These capabilities allow you to build complex, long-running automations that can withstand transient failures and operate reliably. [Dive deeper into Error Handling strategies.](../../deep-dive/10-error-handling/) ```plaintext ``` # Detailed Error Tracing > Quickly debug workflows with comprehensive error traces. When issues arise, IdentityFlow provides detailed error traces to help you quickly diagnose and resolve problems in your workflows. 
**Example Trace Log Format:** (Actual format may vary based on your logger configuration) ```plaintext [2025-02-11T22:02:53.211Z] › c6b9a8fd01498c146cbbcb565c91a062 │ └ +0.000s / 0.018s ✓ workflow.scheduler.processing • workflow.scheduler.processing.count: 1 • workflow.scheduler.scheduled.id: AZT3CVsve923tR4CjdPcfw • workflow.scheduler.scheduled.kind: step.timeout • workflow.instance.id: AZT3CVsLf7OSK6dzkBf0aA • workflow.step.id: AZT3CVsockidGYJPUSnu8Q • workflow.instance.status: active • workflow.definition.id: AZT3CVr4cdesZRJlu5O7SA • workflow.definition.name: 1344092612_0_2_1 ``` **Information Captured in Traces Often Includes:** * Timestamps for each event and step. * Unique identifiers for workflow instances, steps, and definitions. * Workflow status at various points. * Relevant context or error messages. * Correlation IDs for tracking across services (when using OpenTelemetry). **Benefits of Detailed Tracing:** * **Rapid Diagnosis**: Quickly understand the sequence of events leading to an error. * **Reduced Debugging Time**: Pinpoint the exact step and context where a failure occurred. * **Improved System Understanding**: Gain better insight into how your workflows are executing. Combined with OpenTelemetry integration, error tracing in IdentityFlow provides powerful tools for maintaining healthy and reliable automations. [Learn more about Error Handling in our Deep Dive.](../../deep-dive/10-error-handling/) # OpenTelemetry Integration > Explore IdentityFlow's built-in observability with OpenTelemetry. OpenTelemetry Integration Built-in observability with comprehensive tracing and metrics. IdentityFlow integrates seamlessly with OpenTelemetry, the industry standard for observability, providing deep insights into your workflow executions. * Basic Tracing ```typescript export default defineWorkflow('payment processing', async (flow) => { // Each step automatically creates a span await flow.do('validate payment', async () => { // Custom attributes are automatically added await validatePayment(flow.params); }); // Manual instrumentation with destructured span await flow.do('process transaction', async ({ span }) => { span.setAttribute('payment.amount', flow.params.amount); await processTransaction(flow.params); }); }); ``` * Metrics (Conceptual) ```typescript // Note: While the SDK re-exports OpenTelemetry API components, // collecting and exporting metrics requires integrating with an // OpenTelemetry collector and backend system separately. // See OpenTelemetry documentation for setup details. // Example placeholder - actual implementation depends on OTel setup import { metrics } from '@identity-flow/sdk/telemetry'; // Placeholder: Define a counter (requires OTel setup) /* const counter = metrics.createCounter('workflow_executions_total', { description: 'Total number of workflow executions' }); */ export default defineWorkflow( { name: 'monitored-process' }, async (flow) => { // Placeholder: Increment the counter (requires OTel setup) // counter.add(1, { workflow: flow.definition.name }); flow.log('Running monitored process...'); // ... workflow implementation }); ``` **Key Observability Benefits:** * **Distributed Tracing**: Understand the entire lifecycle of a workflow, even as it interacts with other services. * **Performance Monitoring**: Identify bottlenecks and optimize workflow performance with detailed span data. * **Error Diagnostics**: Quickly pinpoint where and why errors occur within your workflows. 
* **Metrics Collection**: (Requires separate OTel collector setup) Gather quantitative data about workflow executions, durations, and custom business metrics. This integration empowers you to monitor, debug, and optimize your automated processes effectively. [Learn more about Observability in our Deep Dive section.](../../deep-dive/40-observability/) ```plaintext ``` # Case Studies > Real-world examples of IdentityFlow in production E-commerce Learn how a major e-commerce platform processes thousands of orders per day with reliable, traceable execution. Fintech Discover how a fintech company ensures exactly-once execution for critical financial transactions. SaaS See how a SaaS company automated their complex user onboarding process while maintaining compliance. ## [E-commerce Order Processing](#e-commerce-order-processing) [Section titled “E-commerce Order Processing”](#e-commerce-order-processing) ### [Challenge](#challenge) [Section titled “Challenge”](#challenge) A major e-commerce platform needed to handle complex order fulfillment workflows with multiple steps, including inventory checks, payment processing, and shipping coordination. ### [Solution](#solution) [Section titled “Solution”](#solution) IdentityFlow’s event-sourced architecture provided: * Reliable order processing with exactly-once execution * Real-time visibility into order status * Automatic retry handling for failed steps * Complete audit trail for every order ### [Results](#results) [Section titled “Results”](#results) * 99.99% order processing reliability * 45% reduction in manual intervention * Full compliance with audit requirements ## [Financial Transaction Processing](#financial-transaction-processing) [Section titled “Financial Transaction Processing”](#financial-transaction-processing) ### [Challenge](#challenge-1) [Section titled “Challenge”](#challenge-1) A fintech company required absolute consistency in payment processing with zero room for duplicate transactions or lost updates. ### [Solution](#solution-1) [Section titled “Solution”](#solution-1) Implemented IdentityFlow with: * Idempotent transaction handling * Comprehensive audit logging * Real-time monitoring * Automated recovery procedures ### [Results](#results-1) [Section titled “Results”](#results-1) * Zero duplicate transactions * 100% transaction traceability * 60% faster issue resolution ## [User Onboarding Automation](#user-onboarding-automation) [Section titled “User Onboarding Automation”](#user-onboarding-automation) ### [Challenge](#challenge-2) [Section titled “Challenge”](#challenge-2) A SaaS provider wanted to streamline their user onboarding while maintaining security and compliance requirements. ### [Solution](#solution-2) [Section titled “Solution”](#solution-2) IdentityFlow enabled: * Automated verification workflows * Step-by-step progress tracking * Compliance documentation * Integration with existing systems ### [Results](#results-2) [Section titled “Results”](#results-2) * 70% reduction in onboarding time * Improved user experience * Full compliance maintenance ## [Share Your Success Story](#share-your-success-story) [Section titled “Share Your Success Story”](#share-your-success-story) Contact Us We’re always looking for new success stories. If you’ve implemented IdentityFlow in an interesting way, we’d love to hear from you. Contact us at # 404 > Houston, we have a problem. We couldn’t find that page.
Check the URL or try using the search bar. # Initial Release ## [Hello New](#hello) [Section titled “Hello ”New](#hello) Hello world! # Deep Dive > In-depth technical explanations of IdentityFlow concepts, features, and advanced usage patterns. This section provides in-depth technical explanations of IdentityFlow’s core concepts, internal workings, and advanced features. Explore these topics to master complex workflow orchestration and maximize the engine’s capabilities. ## [Core Technical Concepts](#core-technical-concepts) [Section titled “Core Technical Concepts”](#core-technical-concepts) [Workflow Rules ](./05-workflow-rules/) [Error Handling ](./10-error-handling/) [Data Validation ](./20-data-validation/) [Retry Policies ](./15-retry-policies/) [Integrations ](./30-integrations/) [Observability ](./40-observability/) [Performance ](./35-performance/) [TypeScript Support ](./25-typescript-support/) [API Reference ](./80-api-reference/) ## [Advanced Usage Patterns](#advanced-usage-patterns) [Section titled “Advanced Usage Patterns”](#advanced-usage-patterns) Explore techniques for more complex scenarios: [Parallel Processing ](./50-parallel-processing/) [Sub-Workflows ](./60-sub-workflows/) [Complex State Management ](./60-complex-state-management/) [Custom Step Types ](./70-custom-step-types/) [Dynamic Workflows ](./75-dynamic-workflows/) [Resource Management ](./55-resource-management/) # Rules of Workflows > Essential guidelines for building reliable, long-running workflows Idempotency Ensure API calls with side effects are idempotent to prevent duplicates. Granular Steps Break workflows into small, self-contained units of work. State Management Store state through step outputs, not in-memory variables. Event Immutability Never mutate incoming events, create new state instead. ## [1. Ensure Idempotency](#1-ensure-idempotency) [Section titled “1. Ensure Idempotency”](#1-ensure-idempotency) * Rule Before performing non-idempotent operations (e.g., payments), verify the operation hasn’t already been completed. This is crucial for handling potential retries correctly. See [Retry Policies](../15-retry-policies/). * Example ```typescript export default defineWorkflow('charge-customer', async (flow) => { await flow.do('charge customer', async () => { // First, check if already charged const subscription = await fetch( `https://payment.api/subscriptions/${customerId}` ).then((res) => res.json()); if (subscription.charged) { return; // Skip if already charged } // Proceed with charging return await fetch( `https://payment.api/subscriptions/${customerId}`, { method: 'POST', body: JSON.stringify({ amount: 10.0 }), } ); }); }); ``` ## [2. Make Steps Granular](#2-make-steps-granular) [Section titled “2. Make Steps Granular”](#2-make-steps-granular) * Rule Each step should perform a single, cohesive unit of work that can be retried independently. See [Developing Workflows](../guides/30-defining-workflows/) for more on structuring steps. * Example ```typescript export default defineWorkflow('fetch-and-display-cat', async (flow) => { // Separate steps for different operations const catId = await flow.do('fetch cat id from KV', async ({ use }) => { return await use(KV).get('cutest-cat-id'); }); const catImage = await flow.do('fetch cat image', async () => { return await fetch(`https://api.cat-images.com/${catId}`); }); return catImage; }); ``` ## [3. Persist State Through Steps](#3-persist-state-through-steps) [Section titled “3. 
Persist State Through Steps”](#3-persist-state-through-steps) * Rule Store state exclusively through step outputs rather than local variables that exist only in memory. The engine persists the output of completed steps (`flow.do`). * Example ```typescript export default defineWorkflow('build-cat-gallery', async (flow) => { // Good: Build state from step outputs const catList = await Promise.all([ flow.do('fetch cat 1', async ({ use }) => await use(KV).get('cat-1') ), flow.do('fetch cat 2', async ({ use }) => await use(KV).get('cat-2') ), ]); await flow.sleep('wait before display', '3 hours'); // Use persisted state return await flow.do('display random cat', async () => { const randomCat = catList[Math.floor(Math.random() * catList.length)]; return await fetch(`https://api.cat-images.com/${randomCat}`); }); }); ``` ## [4. Keep Events Immutable](#4-keep-events-immutable) [Section titled “4. Keep Events Immutable”](#4-keep-events-immutable) * Rule Never mutate incoming events. Create and return new state when changes are needed. This aligns with [Event-Sourced Architecture principles](../features/10-event-sourcing/). * Example ```typescript export default defineWorkflow('process-user-event', async (flow) => { // `event` represents the incoming event being processed // Bad: Mutating event payload await flow.do('bad mutation', async ({ use }) => { event.payload.user = await use(KV).get(event.payload.user); }); // Good: Return new state const userData = await flow.do('fetch user data', async ({ use }) => { return await use(KV).get(event.payload.user); }); // Use userData in subsequent steps }); ``` ## [5. Use Deterministic Step Names](#5-use-deterministic-step-names) [Section titled “5. Use Deterministic Step Names”](#5-use-deterministic-step-names) * Rule Give each step a stable, unique identifier rather than one based on timestamps or randomness. This supports [Deterministic Execution](../features/20-deterministic-execution/). * Example ```typescript export default defineWorkflow('process-orders', async (flow) => { // Bad: Non-deterministic name await flow.do(`process order at ${Date.now()}`, async () => { // ... }); // Good: Static name await flow.do('process order', async () => { // ... }); }); ``` ## [Best Practices Summary](#best-practices-summary) [Section titled “Best Practices Summary”](#best-practices-summary) Granularity Divide workflows into small, self-contained steps. Idempotency Check before performing non-idempotent operations. State Management Persist state only via step outputs. Immutability Treat events as immutable, return new state for changes. # Error Handling & Recovery > Master the art of building resilient workflows with comprehensive error handling strategies Retry Policies Configure sophisticated retry strategies with customizable limits, delays, and backoff patterns. [Learn more](../15-retry-policies/) Error Classification Distinguish between transient failures that can be retried and permanent failures that require intervention using custom error types. Logging & Monitoring Utilize built-in logging and OpenTelemetry for observing errors.
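As a quick taste of the error-classification idea before the details below, a domain-specific error can be layered on top of the SDK's `NonRetryableError`. This is a hedged sketch: it assumes `NonRetryableError` is an ordinary `Error` subclass whose constructor takes a message plus an options object with `code` and `cause`, matching the usage shown later on this page; `CardDeclinedError` itself is hypothetical.

```typescript
import { NonRetryableError } from '@identity-flow/sdk';

// Hypothetical domain error: a declined card is a permanent failure,
// so retrying the step would never succeed.
export class CardDeclinedError extends NonRetryableError {
  constructor(declineCode: string, cause?: unknown) {
    super(`Card declined by issuer (${declineCode})`, { code: declineCode, cause });
  }
}

// Inside a step, throwing it marks the failure as non-retryable,
// while plain `Error`s remain subject to the step's retry policy:
// throw new CardDeclinedError('CARD_DECLINED', gatewayError);
```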
## [Error Types](#error-types) [Section titled “Error Types”](#error-types) Understanding the nature of potential errors helps in designing appropriate handling strategies: Transient Failures Temporary issues that may resolve on retry: * Network timeouts * Temporary service unavailability * Rate limiting * Database deadlocks (These should generally be allowed to retry) Permanent Failures Issues that won’t be resolved by retrying: * Invalid input data (`ValidationError`) * Business rule violations (Throw `NonRetryableError`) * Authentication failures (Throw `NonRetryableError`) * Resource not found (Throw `NonRetryableError`) (These should often prevent retries) System Errors Unexpected system-level issues that might cause the workflow to fail: * Out of memory * Process crashes * Engine configuration errors (Often lead to a `FAILED` instance state) ## [Handling Errors in Workflows](#handling-errors-in-workflows) [Section titled “Handling Errors in Workflows”](#handling-errors-in-workflows) Use standard `try...catch` blocks combined with the specific error types exported by `@identity-flow/sdk` for robust error handling. * Catching Specific Errors ```typescript import { defineWorkflow, NonRetryableError, ValidationError, isValidationError, isLockedError } from '@identity-flow/sdk'; export default defineWorkflow('process-payment', async (flow) => { try { await flow.do('charge card', { retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' } }, async () => { // This function might throw different error types return await chargeCard(flow.params); }); flow.log('Payment successful'); } catch (error) { if (isValidationError(error)) { // Handle data validation issues specifically flow.warn('Payment failed due to validation error:', error.message, error.issues); // Perhaps notify user or end workflow gracefully return { status: 'VALIDATION_FAILED', issues: error.issues }; } else if (error instanceof NonRetryableError || isLockedError(error)) { // Handle errors explicitly marked as non-retryable or locked resources flow.error('Permanent error during payment:', error.message, { code: error.code }); // Perform cleanup if needed await flow.do('cleanup-failed-payment', async () => { /* ... */ }); return { status: 'FAILED_PERMANENTLY', reason: error.message }; } else { // Assume other errors are transient and might have been retried // This block catches the error after retries are exhausted flow.error('Payment failed after retries:', error.message); // Rethrow to fail the workflow instance throw error; } } }); ``` * Throwing Specific Errors ```typescript import { defineWorkflow, NonRetryableError, ValidationError } from '@identity-flow/sdk'; // Inside an activity function (e.g., used within flow.do) async function chargeCard(params: any) { // ... pre-checks ...
if (!params.cardNumber || !params.cvc) { // Throw specific validation error throw new ValidationError('Missing card details', { issues: [{ message: 'Card number and CVC are required' }] }); } try { const response = await paymentGateway.charge(params); if (response.code === 'CARD_DECLINED') { // Throw NonRetryableError for permanent declines throw new NonRetryableError('Card declined by issuer', { code: response.code, cause: response.error // Optionally chain the original error }); } if (!response.success) { // Throw generic error for potentially transient gateway issues (will be retried) throw new Error(`Payment gateway error: ${response.message}`); } return response.transactionId; } catch(error) { // Handle or re-throw other network/unexpected errors if (error instanceof NonRetryableError) throw error; // Don't wrap non-retryable throw new Error('Failed to communicate with payment gateway', { cause: error }); } } ``` ## [Error Monitoring](#error-monitoring) [Section titled “Error Monitoring”](#error-monitoring) Effective monitoring is key to understanding workflow health. ### [Error Logging](#error-logging) [Section titled “Error Logging”](#error-logging) Use the built-in `flow.error()`, `flow.warn()` methods within your workflow logic, especially in `catch` blocks, to log detailed contextual information when errors occur. These logs are associated with the specific workflow instance and step, making debugging easier via the GraphQL API or engine logs. ```typescript export default defineWorkflow('process-order', async (flow) => { try { await flow.do('process payment', async () => { return await processPayment(flow.params); }); } catch (error) { // Log detailed error information flow.error('Payment processing failed', { orderId: flow.params.orderId, error: error.message, code: error.code, // Include custom codes if available stack: error.stack, retryable: !(error instanceof NonRetryableError), // Indicate if it was considered retryable context: { /* Any relevant context */ }, }); // Decide whether to re-throw to fail the workflow throw error; } }); ``` ### [Advanced Observability (OpenTelemetry)](#advanced-observability-opentelemetry) [Section titled “Advanced Observability (OpenTelemetry)”](#advanced-observability-opentelemetry) The IdentityFlow SDK includes re-exports from the `@opentelemetry/api` package under `@identity-flow/sdk/telemetry`. For advanced use cases, you can leverage these APIs to create custom spans or interact with the trace context propagated through `flow.instance.meta`. Setting up and exporting this telemetry data requires integration with an OpenTelemetry collector and backend. Refer to the [OpenTelemetry Documentation](https://opentelemetry.io/docs/) for more details on instrumenting applications. ## [Best Practices](#best-practices) [Section titled “Best Practices”](#best-practices) Classify Errors * Use `NonRetryableError` for permanent failures. * Throw standard `Error` for transient issues. * Use `ValidationError` for data issues. * Catch specific errors using type guards. Configure Retries * Set appropriate retry limits via `options`. * Use exponential backoff for external services. * Disable retries (`{ retries: false }`) when needed. [See Retry Policies](../15-retry-policies/) Log Contextually * Use `flow.error` and `flow.warn` in `catch` blocks. * Include relevant parameters and error details. * Log decisions made during error handling. Monitor Systematically * Query failed workflow instances via API. * Analyze logs for patterns. 
* Integrate with external monitoring tools using logs or OpenTelemetry. [See Observability](../40-observability/) ## [Next Steps](#next-steps) [Section titled “Next Steps”](#next-steps) Retry Policies Learn more about configuring [retry strategies](../15-retry-policies/). Observability Dive deeper into [monitoring and tracing](../40-observability/). Testing Master [testing strategies](../45-testing/). # Retry & Backoff Policies > Configure sophisticated retry strategies for robust error recovery Workflow-Level Retries Set default retry behavior for all steps in a workflow. Step-Level Retries Override retry configuration for individual steps. Backoff Strategies Choose from different retry delay patterns. ## [Retry Configuration](#retry-configuration) [Section titled “Retry Configuration”](#retry-configuration) * Workflow Level ```typescript export default defineWorkflow('payment-processing', { retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' } }, async (flow) => { // Step uses workflow's retry policy await flow.do('process payment', async () => { return processPayment(flow.params); }); }); ``` * Step Level ```typescript export default defineWorkflow('payment-processing', async (flow) => { // Step overrides retry policy await flow.do('send receipt', { retries: { limit: 5, delay: '1 minute', backoff: 'linear' } }, async () => { return sendReceipt(flow.params); }); }); ``` * Mixed Levels ```typescript export default defineWorkflow('order-processing', { // Default retry policy retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' } }, async (flow) => { // Uses default policy await flow.do('validate order', async () => { return validateOrder(flow.params); }); // Overrides default policy await flow.do('process payment', { retries: { limit: 5, delay: '1 minute', backoff: 'exponential' } }, async () => { return processPayment(flow.params); }, ); // Disables retries await flow.do('send notification', { retries: false }, async () => { return sendNotification(flow.params); }, ); }); ``` ## [Backoff Strategies](#backoff-strategies) [Section titled “Backoff Strategies”](#backoff-strategies) Constant Fixed delay between retries. Best for predictable operations. ```typescript backoff: 'constant' // delay: 30s, 30s, 30s ``` Linear Delay increases linearly. Good for gradually increasing wait times. ```typescript backoff: 'linear' // delay: 30s, 60s, 90s ``` Exponential Delay increases exponentially. Best for handling transient failures. ```typescript backoff: 'exponential' // delay: 30s, 60s, 120s ``` ## [Error Classification](#error-classification) [Section titled “Error Classification”](#error-classification) Transient Failures Temporary issues like network timeouts or service unavailability. These should be retried. Permanent Failures Unrecoverable errors like invalid input or business rule violations. These should not be retried. ### [Example: Error Classification](#example-error-classification) [Section titled “Example: Error Classification”](#example-error-classification) See the main [Error Handling](../10-error-handling/) page for more on classifying errors. 
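The classification example below calls a `processPayment` helper and branches on `error.code`. That helper is not part of the SDK; a minimal sketch of what it might look like, assuming a payment-gateway client that reports decline codes and an input shape of your choosing, is:

```typescript
// Hypothetical gateway client shape - an assumption for illustration only.
interface PaymentGateway {
  charge(params: { amount: number; cardToken: string }): Promise<{
    success: boolean;
    code?: string; // e.g. 'INVALID_CARD'
    message?: string;
    transactionId?: string;
  }>;
}

declare const paymentGateway: PaymentGateway; // provided by your own integration

// Helper assumed by the example below: it surfaces the gateway's error code
// so the workflow can tell permanent failures apart from transient ones.
async function processPayment(params: { amount: number; cardToken: string }): Promise<string> {
  const response = await paymentGateway.charge(params);
  if (!response.success) {
    const error = new Error(`Payment failed: ${response.message ?? 'unknown error'}`);
    (error as Error & { code?: string }).code = response.code;
    throw error;
  }
  return response.transactionId!;
}
```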
```typescript export default defineWorkflow('payment-processing', async (flow) => { await flow.do('process payment', { retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' } }, async () => { try { return await processPayment(flow.params); } catch (error) { if (error.code === 'INVALID_CARD') { // Permanent failure - don't retry throw new NonRetryableError(error); } // Transient failure - will be retried throw error; } }); }); ``` ## [Circuit Breaker Pattern](#circuit-breaker-pattern) [Section titled “Circuit Breaker Pattern”](#circuit-breaker-pattern) IdentityFlow supports the circuit breaker pattern via step options to prevent repeated calls to failing services. See [Step Options in API Reference](../80-api-reference/#step-options) for configuration details. ```typescript export default defineWorkflow('api-integration', async (flow) => { await flow.do( 'external api call', { retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' }, circuitBreaker: { failureThreshold: 5, resetTimeout: '1 minute' }, }, async () => { return await externalApiCall(flow.params); }, ); }); ``` ## [Best Practices](#best-practices) [Section titled “Best Practices”](#best-practices) Choose Appropriate Limits Set reasonable retry limits based on operation type and consider downstream service limits. Configure Smart Delays Start with shorter delays for quick recovery and use longer delays for resource-intensive operations. Select the Right Strategy Use constant backoff for predictable operations, linear for gradual scaling, and exponential for transient failures. ## [Complete Example](#complete-example) [Section titled “Complete Example”](#complete-example) ```typescript export default defineWorkflow( 'order-processing', { // Default retry policy for all steps retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' }, }, async (flow) => { // Uses default retry policy const order = await flow.do('validate order', async () => { return validateOrder(flow.params); }); // Custom retry policy for critical operation const payment = await flow.do( 'process payment', { retries: { limit: 5, delay: '1 minute', backoff: 'exponential' } }, async () => { return processPayment(order); }, ); // Minimal retries for notification await flow.do( 'send notification', { retries: { limit: 2, delay: '10 seconds', backoff: 'constant' } }, async () => { return sendNotification(payment); }, ); }, ); ``` ## [Next Steps](#next-steps) [Section titled “Next Steps”](#next-steps) Error Handling Learn more about [error handling strategies](../10-error-handling/). Observability Explore [observability features](../40-observability/) for monitoring retries. Testing Master [testing retry scenarios](../45-testing/). # Data Validation & Schema Usage > Ensure data integrity with standardized schema validation throughout your workflows IdentityFlow embraces the [Standard Schema](https://standardschema.dev/) specification, a common interface for TypeScript validation libraries. This approach allows you to use your preferred validation tools, like Valibot, Zod, and ArkType (all of which are Standard Schema compliant), ensuring flexibility and interoperability. Learn more at [standardschema.dev](https://standardschema.dev/). Schema Validation Use [Standard Schema](https://standardschema.dev/) to validate workflow inputs and outputs with your preferred validation library. Type Safety Get full TypeScript type inference from your validation schemas. See [Type-Safe Development](../40-features/40-typesafe-development/). 
Multiple Libraries Choose from popular validation libraries like Valibot, Zod, and more. ## [Example Usage](#example-usage) [Section titled “Example Usage”](#example-usage) * Valibot Lightweight validation library with excellent TypeScript support (v1.0+). ```typescript import * as v from '@identity-flow/sdk/valibot'; import { defineWorkflow } from '@identity-flow/sdk'; const UserSchema = v.object({ id: v.string([v.uuid()]), email: v.string([v.email()]), role: v.union([v.literal('admin'), v.literal('user')]), }); export default defineWorkflow( 'user-onboarding', { schema: UserSchema, // Validate workflow input }, async (flow) => { const user = flow.params; // Typed as User await flow.do('create user', async () => { await createUser(user); }); }, ); ``` * Zod Popular schema validation with strong type inference (3.24.0+). * npm ```sh npm i zod ``` * pnpm ```sh pnpm add zod ``` * yarn ```sh yarn add zod ``` ```typescript import { defineWorkflow } from '@identity-flow/sdk'; import { z } from 'zod'; const UserSchema = z.object({ id: z.string().uuid(), email: z.string().email(), role: z.enum(['admin', 'user']), }); export default defineWorkflow('user-onboarding', { schema: UserSchema }, async (flow) => { const user = flow.params; // Typed as User await flow.do('create user', async () => { await createUser(user); }); }); ``` * ArkType High-performance validation with runtime type checking (v2.0+). * npm ```sh npm i arktype ``` * pnpm ```sh pnpm add arktype ``` * yarn ```sh yarn add arktype ``` ```typescript import { defineWorkflow } from '@identity-flow/sdk'; import { type } from 'arktype'; const UserSchema = type({ id: 'string', // For specific formats like UUID, ArkType uses constraints (see ArkType docs). email: 'string', // For specific formats like email, ArkType uses constraints (see ArkType docs). role: "'admin' | 'user'", }); export default defineWorkflow('user-onboarding', { schema: UserSchema }, async (flow) => { const user = flow.params; // Typed as User await flow.do('create user', async () => { await createUser(user); }); }); ``` ## [Best Practices](#best-practices) [Section titled “Best Practices”](#best-practices) Validate Early Add schemas to workflow definitions to catch invalid data before processing starts. Type Safety Let TypeScript infer types from schemas for compile-time checks. Consistent Validation Use the same schemas across related workflows to maintain consistency. ## [Step-Level Validation](#step-level-validation) [Section titled “Step-Level Validation”](#step-level-validation) ```typescript export default defineWorkflow('order-processing', async (flow) => { await flow.do('validate order', { schema: OrderSchema }, async (input) => { // input is typed from OrderSchema return processOrder(input); }); }); ``` ## [Error Handling](#error-handling) [Section titled “Error Handling”](#error-handling) See the main [Error Handling](../10-error-handling/) guide for detailed strategies.
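The error-handling snippets below reuse `UserSchema` and `ValidationError` without repeating their setup, and `createUser` stands in for your own persistence logic. They assume imports and a schema along these lines (Valibot shown; any Standard Schema library works the same way):

```typescript
import { defineWorkflow, ValidationError } from '@identity-flow/sdk';
import * as v from '@identity-flow/sdk/valibot';

// Schema assumed by the validation examples that follow
const UserSchema = v.object({
  id: v.string([v.uuid()]),
  email: v.string([v.email()]),
  role: v.union([v.literal('admin'), v.literal('user')]),
});
```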
* Basic Validation ```typescript export default defineWorkflow('user-registration', async (flow) => { try { await flow.do('create user', { schema: UserSchema }, async (input) => { return createUser(input); }); } catch (error) { if (error instanceof ValidationError) { flow.error('Invalid user data', { issues: error.issues }); } throw error; } }); ``` * Custom Validation ```typescript export default defineWorkflow('payment-processing', async (flow) => { await flow.do('process payment', { validate: async (input) => { if (input.amount <= 0) { throw new ValidationError('Amount must be positive'); } if (input.currency !== 'USD') { throw new ValidationError('Only USD is supported'); } return input; } }, async (input) => { return processPayment(input); }); }); ``` ## [Schema Composition](#schema-composition) [Section titled “Schema Composition”](#schema-composition) ```typescript const AddressSchema = v.object({ street: v.string(), city: v.string(), country: v.string(), postal: v.string(), }); const UserSchema = v.object({ id: v.string([v.uuid()]), email: v.string([v.email()]), addresses: v.array(AddressSchema), }); export default defineWorkflow('user-registration', { schema: UserSchema }, async (flow) => { // flow.params is typed with nested address array await flow.do('create user', async () => { return createUser(flow.params); }); }); ``` # TypeScript Integration > Leverage TypeScript for type-safe workflow development Type-Safe Development Define workflows with strong typing for parameters, variables, and results, enhancing reliability and maintainability. [Schema Inference ](../20-data-validation/) SDK Type Exports Utilize the comprehensive types exported by `@identity-flow/sdk` for building robust integrations. ## [Leveraging TypeScript for Robust Workflows](#leveraging-typescript-for-robust-workflows) [Section titled “Leveraging TypeScript for Robust Workflows”](#leveraging-typescript-for-robust-workflows) IdentityFlow is built with TypeScript, providing strong typing throughout the SDK to help you catch errors early and build more reliable workflows. * Typed Parameters & Results ```typescript import { defineWorkflow } from '@identity-flow/sdk'; // Define interfaces for your specific workflow's parameters and expected result interface OrderParams { id: string; items: Array<{ productId: string; quantity: number; }>; shipping: { address: string; method: 'standard' | 'express'; }; } interface ProcessResult { orderId: string; status: 'COMPLETED' | 'PENDING' | 'FAILED'; // Consistent status values total: number; } // Use generics with defineWorkflow to type params and the final return value export default defineWorkflow<OrderParams, ProcessResult>( { name: 'process-order' }, async (flow) => { // flow.params is automatically typed as OrderParams const { id, items, shipping } = flow.params; flow.log(`Processing order ${id} with ${items.length} items.`); // The final value returned must match ProcessResult const result: ProcessResult = await flow.do('process order step', async () => { const externalResult = await processOrder({ orderId: id, items, shippingMethod: shipping.method }); // Mapping external result to the defined ProcessResult type return { orderId: id, status: externalResult.status === 'success' ?
'COMPLETED' : 'FAILED', total: externalResult.total }; }); return result; // Type checked against ProcessResult } ); ``` * Type Inference from Schemas ```typescript import { defineWorkflow } from '@identity-flow/sdk'; import * as v from '@identity-flow/sdk/valibot'; // Using built-in Valibot // Define a schema using a Standard Schema compatible library (like Valibot) const OrderItemSchema = v.object({ productId: v.string(), quantity: v.number([v.positive()]) }); const OrderParamsSchema = v.object({ id: v.string([v.uuid()]), items: v.array(OrderItemSchema), customerEmail: v.optional(v.string([v.email()])) }); // Define workflow, providing the schema for input parameters export default defineWorkflow({ name: 'fulfill-order', version: '1.1.0', schema: OrderParamsSchema // SDK uses this schema to validate and infer types }, async (flow) => { // flow.params is automatically inferred as the output type of OrderParamsSchema // You get type safety and autocompletion without explicit generics here! const { id, items, customerEmail } = flow.params; flow.log(`Fulfilling order ${id}`); if (customerEmail) { flow.log(`Notifying customer: ${customerEmail}`) } // You can also use schemas to type activity results const shippingInfoSchema = v.object({ trackingId: v.string(), carrier: v.string() }); const shippingResult = await flow.do( 'arrange-shipping', { schema: shippingInfoSchema }, // Validate and type the step result async () => { // ... logic to arrange shipping ... return { trackingId: 'ABC123XYZ', carrier: 'FlowEx' }; // Must match schema } ); // shippingResult is typed based on shippingInfoSchema flow.log(`Shipment arranged: ${shippingResult.carrier} / ${shippingResult.trackingId}`); return { status: 'Fulfilled', tracking: shippingResult.trackingId }; }); ``` ## [Advanced Type Features](#advanced-type-features) [Section titled “Advanced Type Features”](#advanced-type-features) TypeScript’s advanced features can be useful within workflow logic. ### [Conditional Types](#conditional-types) [Section titled “Conditional Types”](#conditional-types) Conditional types can help create flexible return types based on inputs or intermediate results within your workflow steps. ```typescript // Example: Define a type that varies based on whether data is present type StepCompletion<TData> = TData extends undefined ? { status: 'COMPLETED' } : { status: 'COMPLETED'; data: TData }; async function executeOptionalStep<TData>( task: () => Promise<TData>, ): Promise<StepCompletion<TData>> { const result = await task(); return (result === undefined ? { status: 'COMPLETED' } : { status: 'COMPLETED', data: result }) as StepCompletion<TData>; // Type safety ensures data is present } // Usage in workflow: const optionalDataResult = await flow.do('optional-step', async () => { return await executeOptionalStep(async () => { // ... logic that might or might not return data ... return Math.random() > 0.5 ? { value: 123 } : undefined; }); }); if ('data' in optionalDataResult) { // TS knows optionalDataResult has a data property here flow.log('Optional step produced data:', optionalDataResult.data.value); } ``` ### [Type Guards for Error Handling](#type-guards-for-error-handling) [Section titled “Type Guards for Error Handling”](#type-guards-for-error-handling) Use the type guard functions exported by the SDK (`isNonRetryableError`, `isValidationError`, etc.) within `try...catch` blocks to safely narrow down error types and handle different failure scenarios appropriately. See [Error Handling](../10-error-handling/) for more strategies.
```typescript import { NonRetryableError, ValidationError, isNonRetryableError, isValidationError, } from '@identity-flow/sdk'; async function handleWorkflowError(error: unknown) { if (isValidationError(error)) { // TypeScript knows `error` is `ValidationError` here console.warn('Workflow encountered validation issues:', error.issues); // Specific handling for validation problems } else if (isNonRetryableError(error)) { // TypeScript knows `error` is `NonRetryableError` here console.error('Workflow encountered a non-retryable error:', error.message, { code: error.code, }); // Specific handling for permanent failures } else { // Handle other unknown or potentially transient errors console.error('An unexpected error occurred during workflow:', error); // Potentially re-throw to let the engine handle retries/failure throw error; } } // Example usage in workflow try { await flow.do('critical-step', async () => { /* ... */ }); } catch (err) { await handleWorkflowError(err); } ``` ## [Best Practices Summary](#best-practices-summary) [Section titled “Best Practices Summary”](#best-practices-summary) Use Generics or Schemas Define input/output types for workflows using `` generics or the `schema` option for clarity and safety. [Leverage Schema Inference ](../20-data-validation/) [Typed Bindings ](../30-integrations/) [Use Type Guards ](../10-error-handling/) Enable Strict Mode Use strict TypeScript compiler options (`tsconfig.json`) for maximum compile-time safety. ## [Type-Safe External Services](#type-safe-external-services) [Section titled “Type-Safe External Services”](#type-safe-external-services) Define clear TypeScript interfaces for any external services your workflows interact with. Use these interfaces when creating your binding functions for `flow.use` to ensure type safety during integration. ```typescript // Define a clear interface for your external service interface PaymentService { processPayment( amount: number, currency: string, orderId: string, ): Promise<{ transactionId: string; status: 'success' | 'failed' }>; } // Create a typed binding function that returns an instance implementing the interface const PaymentClientBinding = (): PaymentService => { // In a real scenario, initialize and return your actual payment client instance here console.log('Initializing Payment Client Binding...'); return { processPayment: async (amount, currency, orderId) => { console.log(`Mock charging ${amount} ${currency} for order ${orderId}`); // Simulate API call await new Promise((resolve) => setTimeout(resolve, 50)); return { transactionId: `txn_${Date.now()}`, status: 'success' }; }, }; }; // Use the typed binding in your workflow export default defineWorkflow<{ amount: number; currency: string; orderId: string }>( { name: 'charge-customer' }, async (flow) => { // paymentService is correctly typed as PaymentService const paymentService = flow.use(PaymentClientBinding); const paymentResult = await flow.do('process-payment-via-service', async () => { // Type checking ensures correct arguments are passed return paymentService.processPayment( flow.params.amount, flow.params.currency, flow.params.orderId, ); }); // paymentResult is correctly typed based on PaymentService interface flow.log(`Payment status: ${paymentResult.status}, ID: ${paymentResult.transactionId}`); return paymentResult; }, ); ``` ## [Next Steps](#next-steps) [Section titled “Next Steps”](#next-steps) Developing Workflows Explore the main guide on [Developing Workflows](../../guides/30-defining-workflows/) for detailed API usage. 
Data Validation Learn more about using [schemas for validation](../20-data-validation/). Error Handling See examples of using type guards in [error handling](../10-error-handling/). # External Service Integration > Connect workflows to APIs and databases with lazy-loaded dependencies [Lazy Loading ](../features/30-simple-api/) Connection Pooling Optimize resource usage with smart connection management. [Type Safety ](../features/40-typesafe-development/) ## [Available Bindings](#available-bindings) [Section titled “Available Bindings”](#available-bindings) GraphQL Client Type-safe GraphQL operations with automatic query validation. HTTP Client Configurable HTTP client with retry and circuit breaker support. Database Clients Optimized database access with connection pooling. Message Queues Reliable message queue integration for async operations. ## [Client Configuration](#client-configuration) [Section titled “Client Configuration”](#client-configuration) Configuration options often include settings for reliability features like retries and circuit breakers. Learn more in [Retry Policies](../15-retry-policies/). * GraphQL ```typescript import { defineWorkflow } from '@identity-flow/sdk'; import { createGraphqlClient } from '@identity-flow/sdk/bindings/graphql'; const GraphqlClient = () => createGraphqlClient({ url: 'https://api.example.com/graphql', headers: { 'Authorization': `Bearer ${process.env.API_TOKEN}` }, retries: { limit: 3, backoff: 'exponential' }, circuitBreaker: { failureThreshold: 5, resetTimeout: '1 minute' }, caching: { ttl: '5 minutes', staleWhileRevalidate: true } }); ``` See [Retry Policies](../15-retry-policies/) for detailed configuration. * HTTP ```typescript const HttpClient = () => createHttpClient({ baseURL: 'https://api.example.com', timeout: '30 seconds', retries: { limit: 3, backoff: 'exponential' }, circuitBreaker: { failureThreshold: 5, resetTimeout: '1 minute' }, keepAlive: true, compression: true, agent: { maxSockets: 100, keepAlive: true, timeout: 60000 } }); ``` * Database ```typescript const DbClient = () => createDbClient({ poolSize: 10, minConnections: 2, maxIdleTime: '5 minutes', retries: { limit: 3, backoff: 'exponential' }, circuitBreaker: { failureThreshold: 5, resetTimeout: '1 minute' }, queryTimeout: '30 seconds', ssl: true, monitoring: { metrics: true, slowQueryThreshold: '1 second' } }); ``` * Queue ```typescript const QueueClient = () => createQueueClient({ prefetch: 10, retries: { limit: 3, backoff: 'exponential' }, deadLetter: { exchange: 'dlx', routingKey: 'dead-letter' }, monitoring: { metrics: true, healthCheck: { interval: '30 seconds', timeout: '5 seconds' } } }); ``` ## [Usage Examples](#usage-examples) [Section titled “Usage Examples”](#usage-examples) * GraphQL ```typescript export default defineWorkflow('user-data-sync', async (flow) => { const client = flow.use(GraphqlClient); // Type-safe query execution const user = await flow.do('fetch user data', async () => { return client.request<{ user: User }>(` query GetUser($id: ID!)
{ user(id: $id) { id name email profile { avatar bio } } } `, { id: flow.params.userId }); }); // Process results await flow.do('update local data', async () => { await updateUserData(user.data.user); }); }); ``` * Database ```typescript export default defineWorkflow('data-processing', async (flow) => { const db = flow.use(DbClient); // Transaction handling await flow.do('process data', async () => { const result = await db.transaction(async (tx) => { // Query with parameterized values const user = await tx.query( 'SELECT \* FROM users WHERE id = $1', [flow.params.userId] ); // Batch operations await tx.batch([ ['UPDATE users SET last_login = NOW() WHERE id = $1', [user.id]], ['INSERT INTO audit_log (user_id, action) VALUES ($1, $2)', [user.id, 'login']] ]); return user; }); return result; }); }); ``` * Queue ```typescript export default defineWorkflow('event-processing', async (flow) => { const queue = flow.use(QueueClient); // Publish with confirmation await flow.do('publish event', async () => { await queue.publish('events', { type: 'user.created', data: flow.params, timestamp: new Date().toISOString() }, { persistent: true, priority: 1, expiration: '24h' }); }); // Consume messages await flow.do('process messages', async () => { await queue.consume('events', async (msg) => { try { await processMessage(msg); await msg.ack(); } catch (error) { await msg.nack({ requeue: true }); } }); }); }); ``` * HTTP ```typescript export default defineWorkflow('api-integration', async (flow) => { const client = flow.use(HttpClient); // Request with automatic retries const response = await flow.do('fetch data', async () => { return client.get('/users', { params: { id: flow.params.userId }, headers: { 'Accept': 'application/json', 'X-Request-ID': flow.instance.id }, validateStatus: (status) => status === 200 }); }); // Handle response await flow.do('process response', async () => { if (response.data.status === 'success') { await processUserData(response.data.user); } }); }); ``` ## [Error Handling](#error-handling) [Section titled “Error Handling”](#error-handling) ### [Circuit Breaker Pattern](#circuit-breaker-pattern) [Section titled “Circuit Breaker Pattern”](#circuit-breaker-pattern) ```typescript export default defineWorkflow('resilient-integration', async (flow) => { const client = flow.use(HttpClient); await flow.do( 'external api call', { circuitBreaker: { failureThreshold: 5, resetTimeout: '1 minute', halfOpenMaxCalls: 1 } }, async () => { try { return await client.get('/api/data'); } catch (error) { if (error.code === 'RATE_LIMITED') { // Add retry delay hint error.retryAfter = '60 seconds'; } throw error; } }, ); }); ``` ### [Fallback Strategies](#fallback-strategies) [Section titled “Fallback Strategies”](#fallback-strategies) ```typescript export default defineWorkflow('reliable-integration', async (flow) => { const primary = flow.use(HttpClient); const backup = flow.use(BackupClient); await flow.do( 'fetch data', { fallback: async (error) => { if (error.code === 'SERVICE_UNAVAILABLE') { // Use backup service return await backup.get('/api/data'); } throw error; }, }, async () => { return await primary.get('/api/data'); }, ); }); ``` ## [Resource Management](#resource-management) [Section titled “Resource Management”](#resource-management) Connection Pooling Configure appropriate pool sizes and timeouts for optimal resource usage. Circuit Breakers Protect your system from cascading failures with circuit breakers. 
Rate Limiting Implement rate limiting to respect API quotas and prevent overload. ### [Example: Resource Configuration](#example-resource-configuration) [Section titled “Example: Resource Configuration”](#example-resource-configuration) ```typescript const DbClient = () => createDbClient({ // Connection pool settings poolSize: 10, minConnections: 2, maxIdleTime: '5 minutes', // Circuit breaker circuitBreaker: { failureThreshold: 5, resetTimeout: '1 minute' }, // Rate limiting rateLimit: { maxRequests: 1000, perInterval: '1 minute' }, // Monitoring monitoring: { metrics: true, slowQueryThreshold: '1 second', healthCheck: { interval: '30 seconds' }, }, }); ``` ## [Best Practices](#best-practices) [Section titled “Best Practices”](#best-practices) Lazy Loading Use bindings to defer service initialization and load dependencies only when needed. Resource Management Configure appropriate pool sizes and timeouts for optimal resource usage. Error Handling Implement proper retry strategies and circuit breakers for failing services. Monitoring Enable metrics and health checks for all external services. ## [Next Steps](#next-steps) [Section titled “Next Steps”](#next-steps) Error Handling Learn more about [error handling](../10-error-handling/) strategies. Performance Explore [performance optimization](../35-performance/) techniques. Monitoring Set up [observability](../40-observability/) for your integrations. # Performance Optimization > Learn how to optimize workflow performance at scale [Parallel Processing ](../50-parallel-processing/) [Resource Management ](../55-resource-management/) Caching Strategies Reduce redundant operations with intelligent caching. ## [Parallel Processing](#parallel-processing) [Section titled “Parallel Processing”](#parallel-processing) Refer to the main [Parallel Processing](../50-parallel-processing/) page for more details and considerations.
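One caveat for the examples that follow: `Promise.all` by itself starts every step at once and does not cap concurrency. If you need to bound how many items are processed in parallel, a small helper such as the hypothetical `mapWithConcurrency` below (not part of the SDK) can wrap the same `flow.do` calls:

```typescript
// Hypothetical helper - runs `worker` over `items` with at most `limit`
// invocations in flight at any time. Not part of @identity-flow/sdk.
async function mapWithConcurrency<T, R>(
  items: T[],
  limit: number,
  worker: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;

  // Each runner pulls the next unclaimed index until the list is exhausted
  async function run(): Promise<void> {
    while (next < items.length) {
      const index = next++;
      results[index] = await worker(items[index]);
    }
  }

  await Promise.all(Array.from({ length: Math.min(limit, items.length) }, run));
  return results;
}

// Usage inside a workflow (sketch):
// const results = await mapWithConcurrency(items, 5, (item) =>
//   flow.do(`process item ${item.id}`, async () => processItem(item)),
// );
```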
* Basic Parallel ```typescript export default defineWorkflow('batch-processing', async (flow) => { const { items } = flow.params; // Process items in parallel with concurrency control const results = await Promise.all( items.map(item => flow.do(`process item ${item.id}`, async () => { return processItem(item); }) ) ); return results; }); ``` * Grouped Parallel ```typescript export default defineWorkflow('order-processing', async (flow) => { // Run multiple independent operations in parallel const [payment, inventory, shipping] = await Promise.all([ flow.do('process payment', async () => { return processPayment(flow.params); }), flow.do('check inventory', async () => { return checkInventory(flow.params.items); }), flow.do('calculate shipping', async () => { return calculateShipping(flow.params.address); }) ]); return { payment, inventory, shipping }; }); ``` * Dynamic Parallel ```typescript export default defineWorkflow('dynamic-batch', async (flow) => { const { items } = flow.params; // Dynamically adjust concurrency based on load const batchSize = calculateOptimalBatchSize(items.length); // Process in batches to control memory usage const results = []; for (let i = 0; i < items.length; i += batchSize) { const batch = items.slice(i, i + batchSize); const batchResults = await Promise.all( batch.map(item => flow.do(`process item ${item.id}`, async () => { return processItem(item); }) ) ); results.push(...batchResults); } return results; }); ``` ## [Resource Management](#resource-management) [Section titled “Resource Management”](#resource-management) Efficient resource management is crucial for performance and stability. See also the [Resource Management](../55-resource-management/) deep dive and the [Integrations](../30-integrations/) page for client configurations. Connection Pooling Optimize database and API connections with smart pooling strategies. Resource Limits Set appropriate limits to prevent resource exhaustion. [Lazy Loading ](../features/30-simple-api/) ### [Database Optimization](#database-optimization) [Section titled “Database Optimization”](#database-optimization) See [Integrations](../30-integrations/) for full client configuration details. * Connection Pool ```typescript const DbClient = () => createDbClient({ // Connection pool settings poolSize: 10, minConnections: 2, maxIdleTime: '5 minutes', // Query timeout queryTimeout: '30 seconds', // Circuit breaker circuitBreaker: { failureThreshold: 5, resetTimeout: '1 minute' } }); export default defineWorkflow('data-processing', async (flow) => { const db = flow.use(DbClient); await flow.do('process data', async () => { return db.query('SELECT * FROM data'); }); }); ``` * Batch Queries ```typescript export default defineWorkflow('batch-processing', async (flow) => { const db = flow.use(DbClient); const { items } = flow.params; // Use batch operations instead of individual queries await flow.do('insert data', async () => { return db.batchInsert('items', items, { batchSize: 1000 }); }); }); ``` ### [HTTP Client Optimization](#http-client-optimization) [Section titled “HTTP Client Optimization”](#http-client-optimization) See [Integrations](../30-integrations/) for full client configuration details. Note the retry and circuit breaker settings, detailed further in [Retry Policies](../15-retry-policies/). 
```typescript
const HttpClient = () => createHttpClient({
  // Connection settings
  maxConnections: 100,
  keepAlive: true,
  timeout: '30 seconds',
  // Compression
  compression: true,
  // Connection pooling
  agent: {
    keepAlive: true,
    maxSockets: 100,
    maxFreeSockets: 10,
    timeout: 60000
  },
  // Retry configuration
  retries: {
    limit: 3,
    backoff: 'exponential'
  },
  // Circuit breaker
  circuitBreaker: {
    failureThreshold: 5,
    resetTimeout: '1 minute'
  }
});
```

## [Caching Strategies](#caching-strategies)

[Section titled “Caching Strategies”](#caching-strategies)

* Result Caching

```typescript
export default defineWorkflow('data-processing', async (flow) => {
  // Cache results based on input parameters
  const data = await flow.do('fetch data', {
    cache: {
      ttl: '1 hour',
      key: flow.params.id,
      tags: ['user-data'],
      staleWhileRevalidate: '5 minutes'
    }
  }, async () => {
    return fetchExpensiveData(flow.params);
  });
  return data;
});
```

* Conditional Caching

```typescript
export default defineWorkflow('dynamic-caching', async (flow) => {
  await flow.do('process data', {
    cache: {
      ttl: flow.params.priority === 'high' ? '5 minutes' : '1 hour',
      key: `${flow.params.id}-${flow.params.priority}`,
      condition: (params) => params.cacheable !== false,
      onError: 'stale' // Use stale data on error
    }
  }, async () => {
    return processData(flow.params);
  });
});
```

* Cache Invalidation

```typescript
export default defineWorkflow('cache-management', async (flow) => {
  const cache = flow.use(CacheClient);

  // Invalidate specific cache entries
  await flow.do('update data', async () => {
    await updateData(flow.params);
    await cache.invalidate({ tags: ['user-data', flow.params.userId] });
  });
});
```

# Observability & Monitoring

> Gain deep insights with OpenTelemetry integration and detailed traces

[OpenTelemetry ](../features/70-opentelemetry-integration/)Built-in OpenTelemetry integration for comprehensive tracing. Metrics Track key workflow metrics for performance and reliability. [Logging ](../features/60-error-tracing/)Detailed logging capabilities for debugging and auditing.
## [OpenTelemetry Integration](#opentelemetry-integration) [Section titled “OpenTelemetry Integration”](#opentelemetry-integration) * Basic Tracing ```typescript import { defineWorkflow } from '@identity-flow/sdk'; import { trace } from '@identity-flow/sdk/telemetry'; export default defineWorkflow('payment-processing', async (flow) => { // Each step automatically creates a span await flow.do('validate payment', async ({ span }) => { // Add custom attributes to the span span.setAttribute('payment.amount', flow.params.amount); return validatePayment(flow.params); }); }); ``` * Custom Spans ```typescript export default defineWorkflow('order-processing', async (flow) => { await flow.do('process order', async ({ span }) => { // Create nested spans for detailed tracing const validateSpan = span.createSpan('validate order'); try { await validateOrder(flow.params); validateSpan.setStatus({ code: SpanStatusCode.OK }); } catch (error) { validateSpan.setStatus({ code: SpanStatusCode.ERROR, message: error.message }); throw error; } finally { validateSpan.end(); } }); }); ``` * Baggage ```typescript export default defineWorkflow('user-flow', async (flow) => { await flow.do('process user', async ({ span }) => { // Add context that flows with the trace span.setBaggage('user.id', flow.params.userId); span.setBaggage('tenant.id', flow.params.tenantId); return processUser(flow.params); }); }); ``` ## [Metrics](#metrics) [Section titled “Metrics”](#metrics) Execution Time Track workflow and step execution duration. Success Rates Monitor success and failure rates. Resource Usage Track system resource utilization. Custom Metrics Define and track business-specific metrics. ### [Metric Types](#metric-types) [Section titled “Metric Types”](#metric-types) * Counters ```typescript import { metrics } from '@identity-flow/sdk/telemetry'; const executionCounter = metrics.createCounter('workflow_executions_total', { description: 'Total number of workflow executions', labels: ['workflow', 'status'] }); export default defineWorkflow('monitored-process', async (flow) => { try { await flow.do('process', async () => { // Process logic }); executionCounter.add(1, { workflow: flow.definition.name, status: 'success' }); } catch (error) { executionCounter.add(1, { workflow: flow.definition.name, status: 'error' }); throw error; } }); ``` * Histograms ```typescript const executionTime = metrics.createHistogram('workflow_execution_time', { description: 'Workflow execution duration', buckets: [0.1, 0.5, 1, 2, 5, 10], // seconds labels: ['workflow'] }); export default defineWorkflow('timed-process', async (flow) => { const start = Date.now(); try { await flow.do('process', async () => { // Process logic }); } finally { executionTime.record((Date.now() - start) / 1000, { workflow: flow.definition.name }); } }); ``` * Gauges ```typescript const activeWorkflows = metrics.createGauge('workflow_active_count', { description: 'Number of currently active workflows', labels: ['type'] }); export default defineWorkflow('tracked-process', async (flow) => { activeWorkflows.inc({ type: flow.definition.name }); try { await flow.do('process', async () => { // Process logic }); } finally { activeWorkflows.dec({ type: flow.definition.name }); } }); ``` ## [Logging](#logging) [Section titled “Logging”](#logging) Detailed logging is essential for debugging and understanding workflow execution. IdentityFlow provides standard logging methods (`flow.log`, `flow.info`, `flow.warn`, `flow.error`, etc.) 
that automatically associate log messages with the current workflow instance and step. See the [Error Tracing feature page](../features/60-error-tracing/) for how logs contribute to debugging. * Basic Logging ```typescript export default defineWorkflow('order-processing', async (flow) => { flow.debug('Starting order processing', { orderId: flow.params.orderId }); try { await flow.do('process order', async () => { flow.info('Processing order details', { items: flow.params.items.length }); return processOrder(flow.params); }); } catch (error) { flow.error('Order processing failed', { error: error.message }); throw error; } }); ``` * Structured Logging ```typescript export default defineWorkflow('payment-processing', async (flow) => { await flow.do('process payment', async ({ log }) => { log.info('Processing payment', { amount: flow.params.amount, currency: flow.params.currency, method: flow.params.method, timestamp: new Date().toISOString() }); try { const result = await processPayment(flow.params); log.info('Payment processed', { transactionId: result.id, status: result.status }); return result; } catch (error) { log.error('Payment failed', { error: error.message, code: error.code, retryable: error.retryable }); throw error; } }); }); ``` * Context Logging ```typescript export default defineWorkflow('user-management', async (flow) => { // Set context for all logs in this workflow flow.setLogContext({ userId: flow.params.userId, tenantId: flow.params.tenantId, environment: process.env.NODE_ENV }); await flow.do('update user', async ({ log }) => { // Context is automatically included in all logs log.info('Updating user profile'); return updateUser(flow.params); }); }); ``` ## [Monitoring Integration](#monitoring-integration) [Section titled “Monitoring Integration”](#monitoring-integration) ### [Prometheus Integration](#prometheus-integration) [Section titled “Prometheus Integration”](#prometheus-integration) ```typescript import { metrics } from '@identity-flow/sdk/telemetry'; // Define metrics const executionTime = metrics.createHistogram('workflow_execution_time', { description: 'Workflow execution duration', buckets: [0.1, 0.5, 1, 2, 5, 10] }); const errorCount = metrics.createCounter('workflow_errors_total', { description: 'Total number of workflow errors' }); // Use in workflow export default defineWorkflow('monitored-process', async (flow) => { const start = Date.now(); try { await flow.do('process', async () => { // Process logic }); } catch (error) { errorCount.add(1, { workflow: flow.definition.name, error: error.code }); throw error; } finally { executionTime.record((Date.now() - start) / 1000, { workflow: flow.definition.name }); } }); ``` ### [Alert Configuration](#alert-configuration) [Section titled “Alert Configuration”](#alert-configuration) ```typescript // Define alerts metrics.createGauge('workflow_duration_seconds', { description: 'Workflow execution duration', alerting: { rules: [ { name: 'LongRunningWorkflow', condition: 'workflow_duration_seconds > 300', duration: '5m', severity: 'warning', annotations: { summary: 'Workflow running longer than 5 minutes' }, }, ], }, }); // Error rate alerting metrics.createGauge('workflow_error_rate', { description: 'Workflow error rate', alerting: { rules: [ { name: 'HighErrorRate', condition: 'rate(workflow_errors_total[5m]) > 0.1', severity: 'critical', annotations: { summary: 'High workflow error rate detected' }, }, ], }, }); ``` ## [Best Practices](#best-practices) [Section titled “Best Practices”](#best-practices) Consistent 
Logging Use structured logging with consistent fields across workflows. Meaningful Traces Add relevant context to spans and logs for easier debugging. Performance Metrics Track key performance indicators for workflow optimization. Error Context Include detailed context in error logs for quick resolution. ## [Next Steps](#next-steps) [Section titled “Next Steps”](#next-steps) Error Handling Learn more about [error handling](../10-error-handling/) strategies. Performance Explore [performance optimization](../35-performance/) techniques. Testing Master [testing strategies](../45-testing/) for your workflows. # Testing Strategies > Learn to test workflows effectively with our comprehensive testing utilities Testing Utilities Use our testing utilities to validate workflow behavior with ease. Dependency Mocking Mock external services for reliable and isolated testing. Workflow Assertions Comprehensive assertions for workflow states and outcomes. ## [Testing Utilities](#testing-utilities) [Section titled “Testing Utilities”](#testing-utilities) * Basic Test ```typescript import { useWorkflowEngine } from '@identity-flow/sdk/testing'; import { describe } from 'vitest'; describe.concurrent('Workflows', () => { const test = useWorkflowEngine({ workflows: { path: new URL('./workflows', import.meta.url).href }, }); test('starts and runs a workflow', async ({ engine, expect, task, waitFor }) => { const workflow = await engine.defineWorkflow(task.id, async (flow) => { const { params } = flow; const result = await flow.do('First step', async () => { await scheduler.wait(50); // Wait shortly before continuing return { output: 'First step result' }; }); await flow.sleep('Wait', '0.1 seconds'); const result2 = await flow.do('Second step', () => { return { output: 'Second step result' }; }); return [result, result2, flow.instance.createdAt, params]; }); const instance = await engine.start(workflow.id, { params: undefined }); await waitFor(instance).toHaveStep({ name: 'Wait', status: 'SLEEPING' }); await waitFor(instance).toHaveStep({ name: 'Second step', status: 'COMPLETED' }); await waitFor(instance).toMatch({ status: 'COMPLETED', data: [ { output: 'First step result' }, { output: 'Second step result' }, expect.any(Date), undefined, ], }); }); }); ``` * Mocking Dependencies Learn more about dependency injection in [Integrations](../30-integrations/). ```typescript test('mocks external service', async ({ engine, expect }) => { const mockClient = { request: vi.fn().mockResolvedValue({ data: { user: { id: 1 } } }) }; const workflow = await engine.defineWorkflow('test', { bindings: { GraphqlClient: () => mockClient } }, async (flow) => { const client = flow.use(GraphqlClient); return await client.request('query { user { id } }'); }); const instance = await engine.start(workflow.id); await expect(instance).toComplete(); expect(mockClient.request).toHaveBeenCalled(); }); ``` ## [Testing Features](#testing-features) [Section titled “Testing Features”](#testing-features) Pure Function Design Test workflows like pure functions with predictable outcomes. Injectable Dependencies Easy mocking of external services for isolated testing. Async Support Test asynchronous flows with built-in async/await support. Step Validation Validate step-by-step execution with detailed assertions. 
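These features combine naturally. As a sketch, assuming the same `useWorkflowEngine` utilities shown above (the failing client binding and the inline workflow are illustrative, not part of the SDK), an isolated error-path test could look like this; the assertion style matches the scenario snippets in the next section:

```typescript
import { useWorkflowEngine } from '@identity-flow/sdk/testing';
import { describe, vi } from 'vitest';

describe('error paths', () => {
  const test = useWorkflowEngine({
    workflows: { path: new URL('./workflows', import.meta.url).href },
  });

  test('a failing binding surfaces as a step error', async ({ engine, expect, task }) => {
    // Illustrative mock: every call to the client rejects
    const failingGet = vi.fn().mockRejectedValue(new Error('Invalid input'));
    const FailingHttpClient = () => ({ get: failingGet });

    const workflow = await engine.defineWorkflow(
      task.id,
      { bindings: { FailingHttpClient } },
      async (flow) => {
        // The step itself stays trivial; the mocked dependency does the failing
        await flow.do('validate input', async ({ use }) => {
          return use(FailingHttpClient).get('/validate');
        });
      },
    );

    const instance = await engine.start(workflow.id);

    await expect(instance).toError({
      step: 'validate input',
      error: expect.stringContaining('Invalid input'),
    });
    expect(failingGet).toHaveBeenCalled();
  });
});
```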
## [Common Test Scenarios](#common-test-scenarios) [Section titled “Common Test Scenarios”](#common-test-scenarios) * Step Completion ```typescript await waitFor(instance).toHaveStep({ name: 'process payment', status: 'COMPLETED' }); ``` * Error Handling See [Error Handling](../10-error-handling/) for strategies. ```typescript await expect(instance).toError({ step: 'validate input', error: expect.stringContaining('Invalid input') }); ``` * Timeout Behavior See [Complex State Management](../65-complex-state-management/) for timeout patterns. ```typescript await waitFor(instance).toHaveStep({ name: 'long operation', status: 'timeout' }); ``` * State Transitions ```typescript await waitFor(instance).toMatch({ status: 'COMPLETED', data: expect.objectContaining({ processed: true }) }); ``` ## [Best Practices](#best-practices) [Section titled “Best Practices”](#best-practices) Isolate Tests Use fresh engine instances and mock dependencies for each test. Test Edge Cases Verify [error handling](../10-error-handling/), timeouts, and [retry behavior](../15-retry-policies/). Maintain Clarity Use descriptive test names and document complex scenarios. ## [Testing Utilities Reference](#testing-utilities-reference) [Section titled “Testing Utilities Reference”](#testing-utilities-reference) useWorkflowEngine Creates an isolated test environment for workflows. waitFor Async assertions for workflow states and transitions. expect Extended assertions for workflow testing. task Test-specific context and utilities. # Parallel Processing in Workflows > Learn how to execute multiple workflow steps concurrently for improved performance. # [Parallel Processing](#parallel-processing) [Section titled “Parallel Processing”](#parallel-processing) IdentityFlow allows you to process multiple items or execute several distinct tasks concurrently within a workflow. This is particularly useful for bulk operations or when independent branches of logic can run simultaneously, significantly improving overall workflow performance. **Example: Bulk Item Processing** ```typescript export default defineWorkflow('bulk-processing', async (flow) => { const { items } = flow.params; // Use Promise.all to wait for all concurrent tasks to complete const results = await Promise.all( items.map((item) => // Each flow.do() call within the map can execute in parallel flow.do(`process item ${item.id}`, async () => { return processItem(item); // Your custom function to process an individual item }), ), ); // The 'results' array will contain the outcome of each processed item return results; }); ``` **Considerations for Parallel Processing:** * **Error Handling**: When using `Promise.all`, if any of the parallel tasks reject, `Promise.all` itself will reject immediately. Consider using `Promise.allSettled` if you need to know the outcome of all tasks, even if some fail. See [Error Handling](../10-error-handling/) for more strategies. * **Resource Limits**: Be mindful of external resource limits (e.g., database connections, API rate limits) when running many tasks in parallel. See [Resource Management](../55-resource-management/). * **Idempotency**: Ensure that concurrently running tasks are idempotent if there’s a chance of retries. Parallel processing is a powerful feature for optimizing workflows that involve multiple independent operations. [See Sub-Workflows for another way to manage complex processes.](../60-sub-workflows/) # Advanced Resource Management > Efficiently manage external resources and connections within your workflows. 
# [Advanced Resource Management](#advanced-resource-management) [Section titled “Advanced Resource Management”](#advanced-resource-management) Workflows often interact with external resources such as database connections, API clients, or file handles. Proper management of these resources—acquiring them when needed and releasing them promptly—is crucial for efficiency and stability, especially in long-running or high-concurrency scenarios. IdentityFlow’s `flow.use()` provides a basic pattern for lazy-loaded, cached resources. For more explicit control, you can use `try...finally` blocks. **Example: Explicit Acquire and Release** This pattern is useful when a resource needs to be explicitly acquired at the start of a specific operation and guaranteed to be released even if errors occur. ```typescript // Assume these are functions to manage a hypothetical external resource // async function acquireResource(): Promise { ... } // async function useResource(resource: ResourceType, params: any): Promise { ... } // async function releaseResource(resource: ResourceType): Promise { ... } export default defineWorkflow('resource-management-explicit', async (flow) => { let resource; try { resource = await flow.do('acquire resource', async () => { return acquireResource(); }); await flow.do('use resource', async () => { // Perform operations using the acquired resource return useResource(resource, flow.params.data); }); // ... more steps using the resource } catch (error) { flow.error('Error during resource usage', { error: error.message }); // Depending on the error, you might re-throw or handle it throw error; } finally { // Ensure the resource is released even if errors occurred in the try block if (resource) { await flow.do('release resource', async () => { return releaseResource(resource); }); } } return { status: 'Resource operations completed' }; }); ``` **Key Considerations for Resource Management:** * **Lazy Loading with `flow.use()`**: For resources that can be initialized once and reused throughout the workflow (or parts of it), `flow.use()` is often the preferred method due to its simplicity and automatic caching. (Refer to the [Simple API feature page](../features/30-simple-api/) for `flow.use()` examples). * **Explicit Control**: Use `try...finally` for resources that have a well-defined lifecycle within a specific part of the workflow and must be cleaned up. * **Idempotent Release**: Ensure that release operations are idempotent (can be safely called multiple times) if there’s any chance of them being invoked more than once during error recovery or retries. * **Connection Pooling**: For database connections or similar resources, leverage connection pooling mechanisms provided by client libraries rather than acquiring/releasing connections for every single operation. See [Integrations](../30-integrations/) for client configuration examples. * **Timeouts**: Implement timeouts for acquiring or using resources to prevent workflows from hanging indefinitely. Effective resource management is key to building robust and performant workflows. [Review Parallel Processing techniques.](../50-parallel-processing) # Using Sub-Workflows > Break down complex processes into manageable, reusable sub-workflows. # [Using Sub-Workflows](#using-sub-workflows) [Section titled “Using Sub-Workflows”](#using-sub-workflows) IdentityFlow supports the concept of sub-workflows (or child workflows), allowing you to break down large, complex processes into smaller, more manageable, and often reusable units. 
A parent workflow can start one or more sub-workflows using [`flow.startWorkflow()`](../80-api-reference/#workflowcontext) and optionally wait for their completion or react to their outcomes. **Example: Order Fulfillment with Payment Sub-Workflow** ```typescript export default defineWorkflow('order-fulfillment', async (flow) => { // Start a sub-workflow to handle payment processing // The parent workflow waits for the sub-workflow to complete const paymentResult = await flow.startWorkflow('process-payment', { params: { amount: flow.params.orderAmount, customerId: flow.params.customerId }, }); // Proceed based on the outcome of the sub-workflow if (paymentResult.status === 'paid') { await flow.startWorkflow('ship-order', { params: { orderId: flow.params.orderId, shippingAddress: flow.params.address }, }); return { overallStatus: 'Order Shipped' }; } else { // See Error Handling guide for more details flow.error('Payment failed for order', { orderId: flow.params.orderId, paymentStatus: paymentResult.status, }); // See [Error Handling](../10-error-handling/) return { overallStatus: 'Order Failed - Payment Issue' }; } }); ``` **Benefits of Using Sub-Workflows:** * **Modularity**: Decompose complex logic into focused, understandable pieces. * **Reusability**: Design generic sub-workflows (e.g., payment processing, notification sending) that can be invoked by multiple parent workflows. * **Isolation**: Failures in a sub-workflow can be handled without necessarily halting the entire parent process, depending on your design. * **Scalability**: Different teams can potentially own and develop different sub-workflows independently. * **Clarity**: Parent workflows become easier to read as they orchestrate higher-level steps. Sub-workflows are a key pattern for building sophisticated and maintainable automated processes. [Learn about Complex State Management for more advanced scenarios.](../65-complex-state-management/) # Complex State Management > Techniques for handling intricate state transitions and conditional logic in workflows. Workflows often involve managing complex states, conditional logic, and waiting for various external events or timeouts. IdentityFlow provides patterns to handle these scenarios effectively. * Approval Process with Timeout This example demonstrates an approval process that waits for an external response but also includes a timeout mechanism using `Promise.race`. ```typescript export default defineWorkflow('approval-process', async (flow) => { const request = await flow.do('submit request', async () => { return createRequest(flow.params); // Your function to create/submit the request }); // Use Promise.race to wait for either an approval response or a timeout const result = await Promise.race([ flow.request('approval response', async ({ token }) => { // This step pauses the workflow, providing a token. // An external system uses this token to send a response and resume the workflow. 
return { requestId: request.id, token }; }), flow.sleep('escalation timeout', '48 hours', { timedOut: true }), // If sleep resolves first, it's a timeout ]); if (result.timedOut) { // Check if the timeout occurred return flow.do('handle timeout', async () => { return handleEscalation(request.id); // Your escalation logic }); } // If not a timeout, 'result' will be the data sent from the external system // (assuming the external system sends an object with a 'status' field) if (result.status === 'approved') { return flow.do('process approval', async () => { return processApproval(request.id); // Your approval logic }); } return flow.do('handle rejection', async () => { return handleRejection(request.id, result.reason); // Your rejection logic }); }); ``` * Multi-Stage Process This example illustrates a workflow with multiple distinct stages, including parallel processing within a stage. ```typescript export default defineWorkflow('multi-stage-process', async (flow) => { // Stage 1: Initial Processing const initialData = await flow.do('initial processing', async () => { return processInitialData(flow.params); }); // Stage 2: Parallel Data Validation and Enrichment const [validationResult, enrichedData] = await Promise.all([ flow.do('validate data', async () => validateData(initialData)), flow.do('enrich data', async () => enrichData(initialData)), ]); // Stage 3: Conditional Processing Based on Validation if (validationResult.status === 'valid') { await flow.do('process valid data', async () => { return processValidData(enrichedData); }); return { finalStatus: 'Processed Successfully' }; } else { await flow.do('handle invalid data', async () => { return handleInvalidData(validationResult.errors); }); return { finalStatus: 'Failed - Invalid Data', errors: validationResult.errors }; } }); ``` **Key Techniques:** * [`flow.request()`](../80-api-reference/#workflowcontext): Pause a workflow to wait for external input, providing a token for correlation. * [`flow.sleep()`](../80-api-reference/#workflowcontext): Introduce delays or implement timeouts. * `Promise.race()`: Manage competing events, such as an external response versus a timeout. * [`Promise.all()`](../50-parallel-processing/) / `Promise.allSettled()`: Handle parallel execution of steps within a stage. See [Parallel Processing](../50-parallel-processing/). * Conditional Logic\*\*: Standard JavaScript `if/else` or `switch` statements to direct flow based on data or step outcomes. By combining these patterns, you can model sophisticated stateful interactions within your IdentityFlow workflows. [Explore Custom Step Types for more reusable logic.](../70-custom-step-types) # Custom Step Types > Define reusable step types with specific behavior and validation for cleaner workflows. For common operations or complex logic that you want to reuse across multiple workflows or steps, you can define custom step functions. These functions can encapsulate specific behaviors, include their own validation, and contribute to cleaner, more modular workflow definitions. **Example: Generic Validation Step Function** This example creates a higher-order function `validateStep` that takes a schema (e.g., a Valibot schema) and returns an asynchronous function suitable for use as a `flow.do()` task. This task will parse and validate input data according to the provided schema. See the [Data Validation](../20-data-validation/) page for more on integrating validation schemas. 
```typescript
import * as v from '@identity-flow/sdk/valibot';
import { type Schema } from '@identity-flow/sdk/valibot'; // Assuming Valibot or similar schema type
import { defineWorkflow } from '@identity-flow/sdk';

// TODO: use ValidationError from @identity-flow/sdk
// Define a custom error for validation failures if needed
class ValidationError extends Error {
  constructor(public details: any) {
    super('Validation failed');
    this.name = 'ValidationError';
  }
}

const validateStep = (schema: Schema) => async (data: unknown): Promise<unknown> => {
  // Replace with your chosen validation library's parsing logic
  // For example, using Valibot:
  // import { safeParse } from 'valibot';
  // return safeParse(schema, data);
  // Placeholder for generic schema validation logic:
  const result = await schema.parse(data); // Example with a parse method on the schema
  if (!result.success) {
    // Adjust based on your schema library's result structure
    throw new ValidationError(result.error);
  }
  return result.data;
};

// Example usage in a workflow:
// Define an input schema (e.g., using Valibot)
const InputSchema = v.object({ id: v.string(), quantity: v.number() });

export default defineWorkflow('validated-process', async (flow) => {
  try {
    // Use the custom validation step inside a regular task;
    // the returned function validates whatever data it is handed
    const validatedData = await flow.do('validate input', async () => {
      return validateStep(InputSchema)(flow.params);
    });

    // Continue with validatedData
    flow.log('Input validated successfully', validatedData);
    // ... further processing
  } catch (error) {
    if (error instanceof ValidationError) {
      flow.error('Input validation failed', { details: error.details });
    } else {
      flow.error('An unexpected error occurred during validation', { message: error.message });
    }
    throw error; // Fail the workflow
  }
});
```

**Benefits:**

* **Reusability**: Write complex logic once and use it in many places.
* **Abstraction**: Hide implementation details of common tasks.
* **Testability**: Custom step functions can often be unit-tested in isolation.
* **Clarity**: Workflows become easier to read by composing them from well-defined custom steps.

[Consider Dynamic Workflows for adapting behavior at runtime.](../75-dynamic-workflows/)

# Dynamic Workflows

> Build workflows that adapt their behavior based on runtime conditions or external configurations.

IdentityFlow allows for the creation of dynamic workflows whose steps or behavior can be determined at runtime. This is useful for scenarios where the process itself needs to adapt based on input parameters, external configurations, or data retrieved during the workflow’s execution.

**Example: Loading Workflow Steps Dynamically**

In this example, the specific steps to execute are loaded based on a `processType` parameter. This allows a single workflow definition to handle various types of processes, each with its own sequence of operations.

```typescript
// Assume loadWorkflowSteps is a function that fetches step configurations
// based on a process type, e.g., from a database or configuration file.
// interface StepConfig { name: string; action: string; params?: any; }
// async function loadWorkflowSteps(processType: string): Promise<StepConfig[]> { ... }
// Assume executeStep is a function that can execute a step based on its configuration.
// async function executeStep(action: string, params?: any): Promise<any> { ...
} export default defineWorkflow('dynamic-process', async (flow) => { const steps = await flow.do('load steps', async () => { // Determine the type of process from parameters or an initial step const processType = flow.params.processType || 'default-process'; return loadWorkflowSteps(processType); }); // Iterate over the dynamically loaded steps and execute them for (const step of steps) { await flow.do(step.name, async () => { // Pass step-specific parameters if needed return executeStep(step.action, step.params); }); } return { status: 'Dynamic process completed' }; }); ``` **Use Cases for Dynamic Workflows:** * **Configurable Processes**: Allow administrators or users to customize workflow behavior without code changes. * **Data-Driven Workflows**: Adapt the flow based on the nature of the data being processed. * **Plugin Architectures**: Enable new steps or behaviors to be added as plugins. * **A/B Testing Flows**: Dynamically route execution through different paths for testing purposes. **Considerations:** * **Complexity**: Dynamic workflows can be more complex to design, debug, and maintain than static ones. * **Validation**: Ensure that dynamically loaded steps or configurations are valid and secure. * **Observability**: Pay extra attention to logging and tracing to understand the runtime behavior of dynamic flows. See [Observability](../40-observability/). While powerful, use dynamic workflows judiciously where the flexibility they offer outweighs the potential increase in complexity. [Explore Resource Management for handling external connections.](../55-resource-management/) # API Reference > Complete documentation of every method, property, and interface Core API Essential methods and types for defining and running workflows. Step Options Configuration options for individual workflow steps. Retry Policies Customizable retry behavior for handling failures. ## [Core API](#core-api) [Section titled “Core API”](#core-api) Learn more in the [Developing Workflows guide](../guides/workflows/). ### [defineWorkflow](#defineworkflow) [Section titled “defineWorkflow”](#defineworkflow) Creates a new workflow definition: ```typescript defineWorkflow( name: string, options?: WorkflowOptions, fn: WorkflowFunction ): Workflow ``` * Basic Usage ```typescript export default defineWorkflow('hello-world', async (flow) => { await flow.do('greet', async () => { console.log('Hello, World!'); }); }); ``` * With Options ```typescript export default defineWorkflow('process-order', { retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' } }, async (flow) => { await flow.do('process', async () => { return processOrder(flow.params); }); }); ``` ### [WorkflowContext](#workflowcontext) [Section titled “WorkflowContext”](#workflowcontext) The context object passed to workflow functions: [params ](../20-data-validation/)Input parameters passed to the workflow. [do() ](../guides/30-defining-workflows/)Execute a workflow step. [use() ](../55-resource-management/)Access a binding (external service). [sleep() ](../65-complex-state-management/)Pause workflow execution. 
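The members above are typically used together in a single definition. A minimal sketch combining them (the `DbClient` placeholder binding and the `recordId` parameter are illustrative, not part of the SDK):

```typescript
import { defineWorkflow } from '@identity-flow/sdk';

// Illustrative placeholder binding; in practice this would wrap a real client,
// as shown in the Integrations and Resource Management examples
const DbClient = () => ({
  query: async (sql: string, params: unknown[]) => ({ id: params[0], status: 'ACTIVE' }),
});

export default defineWorkflow('context-overview', async (flow) => {
  // params: the validated input passed when the workflow was started
  const { recordId } = flow.params;

  // use(): lazily resolve and cache an external service binding
  const db = flow.use(DbClient);

  // do(): run a durable, replayable step
  const record = await flow.do('load record', async () => {
    return db.query('SELECT * FROM records WHERE id = $1', [recordId]);
  });

  // sleep(): pause the workflow without holding resources
  await flow.sleep('cool-down', '10 seconds');

  await flow.do('log status', async () => {
    flow.log('Record loaded', { recordId, status: record.status });
  });

  return { recordId, status: record.status };
});
```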
## [Step Options](#step-options) [Section titled “Step Options”](#step-options) ```typescript interface StepOptions { // Number of retry attempts and strategy retries?: RetryPolicy; // Maximum execution time timeout?: string | number; // Unique key for idempotency idempotencyKey?: string; // Step-level validation schema?: Schema; // Circuit breaker configuration circuitBreaker?: CircuitBreakerOptions; // Result caching cache?: CacheOptions; } ``` ### [RetryPolicy](#retrypolicy) [Section titled “RetryPolicy”](#retrypolicy) See [Retry Policies](../15-retry-policies/) for detailed configuration. ```typescript interface RetryPolicy { // Maximum retry attempts limit: number; // Delay between retries delay: string | number; // Retry delay pattern backoff: 'constant' | 'exponential' | 'linear'; } ``` ### [CircuitBreakerOptions](#circuitbreakeroptions) [Section titled “CircuitBreakerOptions”](#circuitbreakeroptions) ```typescript interface CircuitBreakerOptions { // Failures before opening failureThreshold: number; // Time before retry resetTimeout: string | number; } ``` ### [CacheOptions](#cacheoptions) [Section titled “CacheOptions”](#cacheoptions) ```typescript interface CacheOptions { // Cache duration ttl: string | number; // Cache key key: string; } ``` ## [Usage Examples](#usage-examples) [Section titled “Usage Examples”](#usage-examples) * Step Execution ```typescript await flow.do('process payment', { retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' }, timeout: '5 minutes', idempotencyKey: orderId }, async () => { return processPayment(flow.params); }); ``` * Parallel Steps See [Parallel Processing](../50-parallel-processing/) for more details. ```typescript const [payment, inventory] = await Promise.all([ flow.do('process payment', async () => { return processPayment(flow.params); }), flow.do('check inventory', async () => { return checkInventory(flow.params); }) ]); ``` * External Services ```typescript const client = flow.use(GraphqlClient); await flow.do('fetch data', async () => { return client.request(` query GetUser($id: ID!) { user(id: $id) { id name } } `, { id: flow.params.userId }); }); ``` ## [Error Handling](#error-handling) [Section titled “Error Handling”](#error-handling) See the main [Error Handling](../10-error-handling/) guide for comprehensive strategies. ### [NonRetryableError](#nonretryableerror) [Section titled “NonRetryableError”](#nonretryableerror) Use to indicate permanent failures that shouldn’t be retried: ```typescript await flow.do('process payment', async () => { try { return await processPayment(flow.params); } catch (error) { if (error.code === 'INVALID_CARD') { throw new NonRetryableError(error); } throw error; } }); ``` ### [ValidationError](#validationerror) [Section titled “ValidationError”](#validationerror) Thrown when input validation fails. See [Data Validation](../20-data-validation/). ```typescript await flow.do('validate input', { schema: InputSchema }, async (input) => { // Input is validated against schema return processInput(input); }); ``` ## [Telemetry](#telemetry) [Section titled “Telemetry”](#telemetry) Explore [Observability Features](../features/70-opentelemetry-integration/) and the [Observability Deep Dive](../40-observability/). 
### [Tracing](#tracing)

[Section titled “Tracing”](#tracing)

```typescript
await flow.do('process order', async ({ span }) => {
  // Add custom attributes
  span.setAttribute('order.id', flow.params.orderId);

  // Create nested span
  const validateSpan = span.createSpan('validate');
  try {
    await validateOrder(flow.params);
    validateSpan.setStatus({ code: SpanStatusCode.OK });
  } catch (error) {
    validateSpan.setStatus({ code: SpanStatusCode.ERROR, message: error.message });
    throw error;
  } finally {
    validateSpan.end();
  }
});
```

### [Metrics](#metrics)

[Section titled “Metrics”](#metrics)

```typescript
// Note: Collecting and exporting metrics requires integrating with an
// OpenTelemetry collector and backend system separately.
// See OpenTelemetry documentation for setup details.
```

## [Type Definitions](#type-definitions)

[Section titled “Type Definitions”](#type-definitions)

### [Complete Interface Reference](#complete-interface-reference)

[Section titled “Complete Interface Reference”](#complete-interface-reference)

```typescript
interface Workflow<TParams = unknown, TResult = unknown> {
  id: string;
  name: string;
  version: string;
  options?: WorkflowOptions;
  run(params: TParams): Promise<TResult>;
}

interface WorkflowOptions {
  retries?: RetryPolicy;
  timeout?: string | number;
  schema?: Schema;
  recovery?: RecoveryHandlers;
}

interface WorkflowContext<TParams = unknown> {
  params: TParams;
  instance: WorkflowInstance;
  definition: WorkflowDefinition;
  do<T>(name: string, options?: StepOptions, fn: StepFunction<T>): Promise<T>;
  use<T>(binding: Binding<T>): T;
  sleep(name: string, duration: string | number): Promise<void>;
}

interface StepFunction<T = unknown> {
  (context: StepContext): Promise<T>;
}

interface StepContext {
  span: Span;
  use<T>(binding: Binding<T>): T;
  log: Logger;
}
```

## [Next Steps](#next-steps)

[Section titled “Next Steps”](#next-steps)

Workflow Rules Learn essential [guidelines](../05-workflow-rules/) for building reliable workflows. Error Handling Master [error handling](../10-error-handling/) strategies. TypeScript Explore [TypeScript integration](../25-typescript-support/) for type-safe development.

# Workflow Examples

> Learn by building real-world workflows, from basic patterns to advanced scenarios

This section provides practical, step-by-step examples of common workflow patterns implemented using IdentityFlow.

[User Registration ](./user-registration/)Handle user sign-up with input validation, account creation, and notifications. [Order Processing ](./order-processing/)Process e-commerce orders with parallel steps for payment and inventory, plus error handling. [Document Approval ](./document-approval/)Implement an approval process involving multiple reviewers, timeouts, and conditional logic.

## [Explore Further](#explore-further)

[Section titled “Explore Further”](#explore-further)

[Core Workflow Activities ](../guides/40-workflow-activities/)Understand the building blocks like `flow.do`, `flow.sleep`, and `flow.dialog`. [Deep Dive Topics ](../deep-dive/)Explore advanced concepts like Workflow Rules, Testing, and Observability.

# Document Approval Workflow

> An example of a document approval workflow featuring multiple reviewers, timeouts, and conditional logic.

This example illustrates how to implement a document approval workflow that involves multiple approvers, waits for their responses with a timeout, and processes the outcomes.

1. **Define Workflow and Input** The workflow will take a document ID and a list of approver IDs as input.
```typescript import * as v from '@identity-flow/sdk/valibot'; import { NonRetryableError, defineWorkflow } from '@identity-flow/sdk'; const ApprovalInputSchema = v.object({ documentId: v.string(), approverIds: v.array(v.string([v.minLength(1)])), // You might also include the document content or a reference to it }); export default defineWorkflow( 'document-approval-example', { schema: ApprovalInputSchema }, async (flow) => { // Workflow steps will go here }, ); ``` 2. **Submit Document & Request Approvals (Parallel)** The first step might be to formally log the document submission. Then, trigger approval requests to all listed approvers concurrently using `Promise.all` with `flow.dialog` for each. ```typescript // Inside the async (flow) => { ... } await flow.do('log document submission', async () => { // Log or update document status to 'Pending Approval' logDocumentEvent(flow.params.documentId, 'SUBMITTED_FOR_APPROVAL'); return { submissionLogged: true }; }); flow.log('Requesting approvals for document:', flow.params.documentId); const approvalDialogPromises = flow.params.approverIds.map((approverId) => flow.dialog( `request approval from ${approverId} for doc ${flow.params.documentId}`, { // Schema for the expected response from each approver schema: v.object({ approved: v.boolean(), comments: v.optional(v.string()) }), }, ({ token }) => ({ params: { form: 'document-approval-form', // UI identifier documentId: flow.params.documentId, approverId, token, // Token for this specific approver's dialog }, assignees: [approverId], // Assign this dialog to the specific approver message: `Approval required for document ${flow.params.documentId}`, // Individual dialog timeout (optional, overall timeout handled by Promise.race) }), ), ); ``` 3. **Wait for All Responses with Overall Timeout** Use `Promise.race` to wait for either all individual dialogs (`Promise.all(approvalDialogPromises)`) to complete or an overall timeout (`flow.sleep`) to occur. ```typescript // Inside the async (flow) => { ... } flow.log('Waiting for approvals with a 24-hour timeout...'); const approvalOutcome = await Promise.race([ Promise.all(approvalDialogPromises), flow.sleep('overall approval timeout', '24 hours').then(() => 'TIMEOUT'), // Distinguish timeout ]); ``` 4. **Process Approval Outcome** Check if the outcome was a timeout or the collected approval responses. Based on the responses, mark the document as approved or rejected. ```typescript // Inside the async (flow) => { ... } if (approvalOutcome === 'TIMEOUT') { flow.warn('Approval timed out for document:', flow.params.documentId); await flow.do('handle approval timeout', async () => { logDocumentEvent(flow.params.documentId, 'APPROVAL_TIMEOUT'); // Notify admin or originator about the timeout sendTimeoutNotification(flow.params.documentId); }); return { documentId: flow.params.documentId, status: 'APPROVAL_TIMEOUT' }; } else { // approvalOutcome is an array of responses if not TIMEOUT const allApproved = approvalOutcome.every(response => response.approved); flow.log( `Approvals received for document ${flow.params.documentId}. Overall status: ${allApproved ? 
'APPROVED' : 'REJECTED'}` ); if (allApproved) { await flow.do('mark document approved', async () => { updateDocumentStatus(flow.params.documentId, 'APPROVED'); // Notify originator of approval sendApprovalNotification(flow.params.documentId, 'APPROVED', approvalOutcome); }); return { documentId: flow.params.documentId, status: 'APPROVED', responses: approvalOutcome }; } else { await flow.do('mark document rejected', async () => { updateDocumentStatus(flow.params.documentId, 'REJECTED'); // Notify originator of rejection sendApprovalNotification(flow.params.documentId, 'REJECTED', approvalOutcome); }); return { documentId: flow.params.documentId, status: 'REJECTED', responses: approvalOutcome }; } } ``` ### [Full Example Code](#full-example-code) [Section titled “Full Example Code”](#full-example-code) ```typescript import * as v from '@identity-flow/sdk/valibot'; import { NonRetryableError, defineWorkflow } from '@identity-flow/sdk'; // --- Mock external functions for demonstration --- async function logDocumentEvent(documentId: string, eventType: string) { console.log(`Logging event for document ${documentId}: ${eventType}`); } async function updateDocumentStatus(documentId: string, status: string) { console.log(`Updating document ${documentId} status to: ${status}`); } async function sendTimeoutNotification(documentId: string) { console.log(`Sending approval timeout notification for document ${documentId}`); } async function sendApprovalNotification(documentId: string, status: string, responses: any) { console.log( `Sending approval notification for document ${documentId}. Status: ${status}. Responses:`, responses, ); } // --- End mock functions --- const ApprovalInputSchema = v.object({ documentId: v.string(), approverIds: v.array(v.string([v.minLength(1)])), }); export default defineWorkflow( 'document-approval-example', { schema: ApprovalInputSchema }, async (flow) => { flow.log( 'Starting document approval workflow for:', flow.params.documentId, 'Approvers:', flow.params.approverIds, ); await flow.do('log document submission', async () => { logDocumentEvent(flow.params.documentId, 'SUBMITTED_FOR_APPROVAL'); return { submissionLogged: true }; }); flow.log('Requesting approvals for document:', flow.params.documentId); const approvalDialogPromises = flow.params.approverIds.map((approverId) => flow.dialog( `request approval from ${approverId} for doc ${flow.params.documentId}`, { schema: v.object({ approved: v.boolean(), comments: v.optional(v.string()) }) }, ({ token }) => ({ params: { form: 'document-approval-form', documentId: flow.params.documentId, approverId, token, }, assignees: [approverId], message: `Approval required for document ${flow.params.documentId}`, }), ), ); flow.log('Waiting for approvals with a 24-hour timeout...'); const approvalOutcome = await Promise.race([ Promise.all(approvalDialogPromises), flow.sleep('overall approval timeout', '5 seconds').then(() => 'TIMEOUT'), // Shortened for demo ]); if (approvalOutcome === 'TIMEOUT') { flow.warn('Approval timed out for document:', flow.params.documentId); await flow.do('handle approval timeout', async () => { logDocumentEvent(flow.params.documentId, 'APPROVAL_TIMEOUT'); sendTimeoutNotification(flow.params.documentId); }); return { documentId: flow.params.documentId, status: 'APPROVAL_TIMEOUT' }; } else { const allApproved = approvalOutcome.every((response) => response.approved); flow.log( `Approvals received for document ${flow.params.documentId}. Overall status: ${allApproved ? 
'APPROVED' : 'REJECTED'}`, ); if (allApproved) { await flow.do('mark document approved', async () => { updateDocumentStatus(flow.params.documentId, 'APPROVED'); sendApprovalNotification(flow.params.documentId, 'APPROVED', approvalOutcome); }); return { documentId: flow.params.documentId, status: 'APPROVED', responses: approvalOutcome, }; } else { await flow.do('mark document rejected', async () => { updateDocumentStatus(flow.params.documentId, 'REJECTED'); sendApprovalNotification(flow.params.documentId, 'REJECTED', approvalOutcome); }); return { documentId: flow.params.documentId, status: 'REJECTED', responses: approvalOutcome, }; } } }, ); ``` ## [Next Steps](#next-steps) [Section titled “Next Steps”](#next-steps) [User Registration Example ](../user-registration/) [Core Workflow Activities ](../../guides/40-workflow-activities/) [Complex State Management ](../../deep-dive/60-complex-state-management/) # Order Processing Workflow > A step-by-step example of an order processing workflow with parallel execution and error handling. This example shows how to build an e-commerce order processing workflow, including validation, parallel payment and inventory checks, shipping, and confirmation. 1. **Define Workflow with Retry Policy** Start by defining the workflow and a default retry policy for its steps. ```typescript import * as v from '@identity-flow/sdk/valibot'; import { NonRetryableError, defineWorkflow } from '@identity-flow/sdk'; // Assuming Valibot for schema // Define an input schema for the order const OrderInputSchema = v.object({ orderId: v.string(), customerId: v.string(), items: v.array(v.object({ productId: v.string(), quantity: v.number() })), totalAmount: v.number(), shippingAddress: v.object({ street: v.string(), city: v.string(), zip: v.string() }), }); export default defineWorkflow( 'order-processing-example', { schema: OrderInputSchema, retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' }, }, async (flow) => { // Workflow steps will go here }, ); ``` 2. **Validate Order Details** Perform initial validation on the order. `flow.params` is already validated by the workflow schema, but you might have custom business rule checks. ```typescript // Inside the async (flow) => { ... } const validatedOrder = await flow.do('validate order business rules', async () => { if (!flow.params.items?.length) { // Throw a NonRetryableError if it's a business rule violation that shouldn't be retried. throw new NonRetryableError('Order must contain items'); } // Perform other business rule validations if necessary... console.log('Order business rules validated for:', flow.params.orderId); return flow.params; // Return the validated (or transformed) order data }); ``` 3. **Process Payment & Check Inventory (Parallel)** Use `Promise.all` to execute payment processing and inventory checking concurrently for efficiency. ```typescript // Inside the async (flow) => { ... 
} const [paymentResult, inventoryStatus] = await Promise.all([ flow.do('process payment', async () => { console.log('Processing payment for order:', validatedOrder.orderId); // Replace with actual payment gateway integration return processPayment(validatedOrder.customerId, validatedOrder.totalAmount); }), flow.do('check inventory', async () => { console.log('Checking inventory for order:', validatedOrder.orderId); // Replace with actual inventory check logic return checkInventoryAvailability(validatedOrder.items); }), ]); // Handle payment failure if (!paymentResult.success) { flow.error('Payment failed for order:', validatedOrder.orderId, paymentResult.failureReason); // Optionally, trigger a compensation logic, like voiding an authorization throw new NonRetryableError(`Payment failed: ${paymentResult.failureReason}`); } ``` 4. **Arrange Shipping** If payment and inventory checks are successful, proceed to arrange shipping. ```typescript // Inside the async (flow) => { ... } await flow.do('arrange shipping', async () => { if (!inventoryStatus.available) { // Handle out-of-stock scenario - this might involve notifications or backorder logic flow.warn('Items not available for order:', validatedOrder.orderId); throw new NonRetryableError('Items not available for shipping.'); } console.log('Arranging shipping for order:', validatedOrder.orderId); // Replace with actual shipping arrangement logic return arrangeShipment(validatedOrder.shippingAddress, validatedOrder.items); }); ``` 5. **Send Order Confirmation** Finally, send a confirmation to the customer. ```typescript // Inside the async (flow) => { ... } await flow.do('send order confirmation', async () => { console.log('Sending confirmation for order:', validatedOrder.orderId); // Replace with actual notification logic await sendConfirmationEmail(validatedOrder.customerId, validatedOrder.orderId, { paymentResult, inventoryStatus }); }); return { orderId: validatedOrder.orderId, status: 'ORDER_PROCESSING_COMPLETED', paymentId: paymentResult.transactionId }; ``` ### [Full Example Code](#full-example-code) [Section titled “Full Example Code”](#full-example-code) ```typescript import * as v from '@identity-flow/sdk/valibot'; import { NonRetryableError, defineWorkflow } from '@identity-flow/sdk'; // --- Mock external functions for demonstration --- async function processPayment(customerId: string, amount: number) { console.log(`Processing payment of ${amount} for customer ${customerId}`); // Simulate payment gateway call if (Math.random() < 0.1) return { success: false, failureReason: 'Insufficient funds' }; return { success: true, transactionId: 'txn-' + Math.random().toString(36).substring(7) }; } async function checkInventoryAvailability(items: any[]) { console.log( 'Checking inventory for items:', items.map((i) => i.productId), ); // Simulate inventory check return { available: Math.random() > 0.05 }; // 95% chance items are available } async function arrangeShipment(address: any, items: any[]) { console.log('Arranging shipment to:', address.city, 'for items:', items.length); } async function sendConfirmationEmail(customerId: string, orderId: string, details: any) { console.log(`Sending order confirmation for ${orderId} to customer ${customerId}`); } // --- End mock functions --- const OrderInputSchema = v.object({ orderId: v.string(), customerId: v.string(), items: v.array(v.object({ productId: v.string(), quantity: v.number() })), totalAmount: v.number(), shippingAddress: v.object({ street: v.string(), city: v.string(), zip: v.string() 
}), });

export default defineWorkflow(
  'order-processing-example',
  { schema: OrderInputSchema, retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' } },
  async (flow) => {
    flow.log('Starting order processing for:', flow.params.orderId);

    const validatedOrder = await flow.do('validate order business rules', async () => {
      if (!flow.params.items?.length) {
        throw new NonRetryableError('Order must contain items');
      }
      flow.log('Order business rules validated for:', flow.params.orderId);
      return flow.params;
    });

    const [paymentResult, inventoryStatus] = await Promise.all([
      flow.do('process payment', async () => {
        flow.log('Processing payment for order:', validatedOrder.orderId);
        return processPayment(validatedOrder.customerId, validatedOrder.totalAmount);
      }),
      flow.do('check inventory', async () => {
        flow.log('Checking inventory for order:', validatedOrder.orderId);
        return checkInventoryAvailability(validatedOrder.items);
      }),
    ]);

    if (!paymentResult.success) {
      flow.error('Payment failed for order:', validatedOrder.orderId, paymentResult.failureReason);
      throw new NonRetryableError(`Payment failed: ${paymentResult.failureReason}`);
    }
    flow.log(
      'Payment successful for order:',
      validatedOrder.orderId,
      'TxnID:',
      paymentResult.transactionId,
    );

    await flow.do('arrange shipping', async () => {
      if (!inventoryStatus.available) {
        flow.warn('Items not available for order:', validatedOrder.orderId);
        throw new NonRetryableError('Items not available for shipping.');
      }
      flow.log('Arranging shipping for order:', validatedOrder.orderId);
      return arrangeShipment(validatedOrder.shippingAddress, validatedOrder.items);
    });
    flow.log('Shipping arranged for order:', validatedOrder.orderId);

    await flow.do('send order confirmation', async () => {
      flow.log('Sending confirmation for order:', validatedOrder.orderId);
      await sendConfirmationEmail(validatedOrder.customerId, validatedOrder.orderId, {
        paymentResult,
        inventoryStatus,
      });
    });
    flow.log('Order confirmation sent for:', validatedOrder.orderId);

    return {
      orderId: validatedOrder.orderId,
      status: 'ORDER_PROCESSING_COMPLETED',
      paymentId: paymentResult.transactionId,
    };
  },
);
```

## [Next Steps](#next-steps)

[Section titled “Next Steps”](#next-steps)

[Document Approval Example ](../document-approval/)Learn about workflows with timeouts and multiple reviewers. [Parallel Processing ](../../deep-dive/50-parallel-processing/)Dive deeper into concurrent step execution. [Error Handling Guide ](../../guides/60-error-handling-retries/)Master strategies for managing failures.

# User Registration Workflow

> A step-by-step example of a user registration workflow with validation and notifications.

This example demonstrates how to create a workflow that handles user registration, including input validation, account creation, and sending a welcome notification.

1. **Define Input Schema & Workflow** First, define the expected input for user registration using a schema (e.g., Valibot). Then, start the workflow definition, associating it with this schema.

```typescript
import * as v from '@identity-flow/sdk/valibot';
import { defineWorkflow } from '@identity-flow/sdk';

const UserSchema = v.object({
  email: v.string([v.email()]),
  password: v.string([v.minLength(8)]),
  name: v.string(),
});

export default defineWorkflow('user-registration', { schema: UserSchema }, async (flow) => {
  // Workflow steps will go here
});
```

2. **Validate User Data (Implicit)** IdentityFlow automatically validates `flow.params` against the `UserSchema` at the start. If validation fails, the workflow won’t proceed.
For explicit validation or transformation as a step: ```typescript // Inside the async (flow) => { ... } const validatedUser = await flow.do('validate user input', async () => { // Assuming validateUserData performs any additional checks or transformations // flow.params is already schema-validated at this point return validateUserData(flow.params); }); ``` *Note: `validateUserData` would be your custom validation/transformation logic if needed beyond the initial schema.* 3. **Create User Account** Perform the actual user creation, perhaps an API call. Configure retries for transient network issues. ```typescript // Inside the async (flow) => { ... } const user = await flow.do( 'create user account', { retries: { limit: 3, backoff: 'exponential' } }, async () => { // Replace with your actual user creation logic (e.g., API call) return createUserInDatabase(validatedUser); }, ); ``` 4. **Send Welcome Email** After successful account creation, send a welcome email. This step also includes retries. ```typescript // Inside the async (flow) => { ... } await flow.do( 'send welcome email', { retries: { limit: 5, delay: '30 seconds' } }, async () => { // Replace with your actual email sending logic await sendNotificationEmail(user.email, 'welcome'); }, ); ``` 5. **Complete the Workflow** Return a result indicating the outcome. ```typescript // Inside the async (flow) => { ... } return { userId: user.id, status: 'REGISTRATION_COMPLETED' }; ``` ### [Full Example Code](#full-example-code) [Section titled “Full Example Code”](#full-example-code) ```typescript import * as v from '@identity-flow/sdk/valibot'; import { defineWorkflow } from '@identity-flow/sdk'; // --- Mock external functions for demonstration --- async function validateUserData(data: any) { console.log('Validating user data:', data); // In a real scenario, add more complex validation if needed return data; // Assuming basic schema validation is enough } async function createUserInDatabase(data: any) { console.log('Creating user in database:', data); return { id: 'user-' + Math.random().toString(36).substring(7), email: data.email }; } async function sendNotificationEmail(email: string, template: string) { console.log(`Sending ${template} email to ${email}`); // Simulate email sending that might fail occasionally if (Math.random() < 0.1) throw new Error('Simulated email send failure'); } // --- End mock functions --- const UserSchema = v.object({ email: v.string([v.email()]), password: v.string([v.minLength(8)]), name: v.string(), }); export default defineWorkflow('user-registration-example', { schema: UserSchema }, async (flow) => { flow.log('Starting user registration workflow for:', flow.params.email); const validatedUser = await flow.do('validate user input', async () => { return validateUserData(flow.params); }); const user = await flow.do( 'create user account', { retries: { limit: 3, backoff: 'exponential' } }, async () => { return createUserInDatabase(validatedUser); } ); flow.log('User account created:', user.id); await flow.do('send welcome email', { retries: { limit: 5, delay: '30 seconds' } }, async () => { await sendNotificationEmail(user.email, 'welcome'); } ); flow.log('Welcome email step processed for:', user.email); return { userId: user.id, status: 'REGISTRATION_COMPLETED' }; }); ``` ## [Next Steps](#next-steps) [Section titled “Next Steps”](#next-steps) Explore other examples or dive deeper into specific concepts: [Order Processing Example ](../order-processing/)See how to handle e-commerce orders with parallel steps.
[Error Handling Guide ](../../guides/60-error-handling-retries/)Learn more about managing failures and retry policies. [Core Workflow Activities ](../../guides/40-workflow-activities/)Understand \`flow\.do\`, \`flow\.sleep\`, and other activities. # IdentityFlow Features > Discover how IdentityFlow can transform your business processes with reliable, observable, and efficient workflow automation. Explore our core capabilities. Reliable Execution Every workflow step is executed exactly once, ensuring consistent business processes. Full Visibility Real-time monitoring and comprehensive audit trails for every workflow. Seamless Integration Connect with your existing systems and third-party services effortlessly. ## [Core Benefits](#core-benefits) [Section titled “Core Benefits”](#core-benefits) ### [Lightweight Durable Execution](#lightweight-durable-execution) [Section titled “Lightweight Durable Execution”](#lightweight-durable-execution) Our engine guarantees that each workflow step is executed exactly one time, ensuring reliability and eliminating duplicate work even in the event of failures. * Guaranteed single execution * Failure recovery built-in * Process consistency guaranteed ### [Robust Retry Mechanisms](#robust-retry-mechanisms) [Section titled “Robust Retry Mechanisms”](#robust-retry-mechanisms) With configurable retry policies—including limits, delays, and backoff strategies—the engine automatically re‑attempts failed steps, overcoming temporary issues without manual intervention. * Smart backoff strategies * Configurable retry limits * Automatic error recovery ### [Fast & Efficient Performance](#fast--efficient-performance) [Section titled “Fast & Efficient Performance”](#fast--efficient-performance) Built with efficiency in mind, the engine delivers high performance through intelligent caching and asynchronous design. * Optimized resource usage * Minimal execution latency * High concurrency support ## [Business Features](#business-features) [Section titled “Business Features”](#business-features) Scheduled Jobs Automate recurring tasks with predefined intervals, ensuring scheduled jobs are executed exactly once per cycle. Time Travel Querying Review historical workflow states at any point in time for auditing and analysis. Transparent Audit Trails Immutable audit trails capture every state change and user interaction. Seamless Integration Connect easily with third-party APIs, databases, and legacy systems. 
## [Industry Applications](#industry-applications) [Section titled “Industry Applications”](#industry-applications) ### [Financial Services](#financial-services) [Section titled “Financial Services”](#financial-services) * Payment processing with exactly-once guarantees * Transaction reconciliation workflows * Compliance and audit reporting * Risk assessment processes ### [E-commerce](#e-commerce) [Section titled “E-commerce”](#e-commerce) * Order fulfillment automation * Inventory management * Payment processing * Shipping coordination ### [Healthcare](#healthcare) [Section titled “Healthcare”](#healthcare) * Patient onboarding workflows * Insurance verification * Claims processing * Compliance documentation ### [Manufacturing](#manufacturing) [Section titled “Manufacturing”](#manufacturing) * Supply chain automation * Quality control processes * Inventory tracking * Production scheduling ## [Implementation Benefits](#implementation-benefits) [Section titled “Implementation Benefits”](#implementation-benefits) Reduced Errors Automated workflows eliminate manual errors and ensure consistent execution. Increased Efficiency Streamline operations with automated, parallel processing capabilities. Better Compliance Maintain detailed audit trails and ensure regulatory compliance. Cost Savings Reduce operational costs through automation and improved efficiency. ## [Explore Our Key Capabilities](#explore-our-key-capabilities) [Section titled “Explore Our Key Capabilities”](#explore-our-key-capabilities) Dive deeper into the technical features that power IdentityFlow: [Event-Sourced Architecture ](./10-event-sourcing/)Learn how every state change is recorded as an immutable event. [Deterministic Execution ](./20-deterministic-execution/)Understand how workflows produce predictable results every time. [Simple, Intuitive API ](./simple-api/)Discover our developer-friendly API for straightforward workflow definition. [Resilient Asynchronous Flow ](./resilient-flow/)See how IdentityFlow robustly handles asynchronous operations and errors. [OpenTelemetry Integration ](./70-opentelemetry-integration/)Explore built-in observability with comprehensive tracing and metrics. [Type-Safe Development ](./40-typesafe-development/)Leverage TypeScript for robust and maintainable workflow definitions. [Detailed Error Tracing ](./60-error-tracing/)Quickly debug workflows with comprehensive error traces. ## [Real-World Success](#real-world-success) [Section titled “Real-World Success”](#real-world-success) [Case Studies ](./80-case-studies/)See how IdentityFlow is used in production to solve real business challenges. ## [Getting Started](#getting-started) [Section titled “Getting Started”](#getting-started) Request a Demo See how IdentityFlow can transform your business processes. Contact us to schedule a personalized demo. [Contact Sales](mailto:info@kenoxa.de) # Developer Guides > Practical guides, tutorials, and walkthroughs for building with IdentityFlow. Welcome to the IdentityFlow Developer Guides. Here you’ll find practical walkthroughs and explanations to help you get started and master building automated processes with IdentityFlow. ## [Foundational Guides](#foundational-guides) [Section titled “Foundational Guides”](#foundational-guides) Start here to understand the basics and get your environment set up. [01: Introduction & Core Concepts ](./10-introduction-concepts/)Understand the fundamental principles: event sourcing, determinism, workflows, steps, etc. 
[02: Quick Start Guide ](./20-quickstart/)Install IdentityFlow and build your first simple workflow quickly. [03: Defining Workflows ](./30-defining-workflows/)Learn the \`defineWorkflow\` function, configuration options, and input validation. ## [Building Workflow Logic](#building-workflow-logic) [Section titled “Building Workflow Logic”](#building-workflow-logic) Dive into the core methods and patterns for implementing your workflow steps. [04: Core Workflow Activities ](./40-workflow-activities/)Master \`flow\.do\`, \`flow\.sleep\`, \`flow\.dialog\`, \`flow\.request\`, and \`flow\.start\`. [05: Dependencies & Logging ](./50-dependencies-logging/)Inject external services using \`flow\.use\` and utilize built-in logging. [06: Error Handling & Retries ](./60-error-handling-retries/)Handle errors gracefully and configure robust retry strategies. ## [Advanced Topics & Examples](#advanced-topics--examples) [Section titled “Advanced Topics & Examples”](#advanced-topics--examples) Explore lifecycle management, best practices, and practical examples. [07: State, Lifecycle & Deployment ](./70-state-lifecycle-deployment/)Understand instance/step states, metadata, and deployment basics. [Workflow Examples ](../examples/)See practical examples for User Registration, Order Processing, and Document Approval. [Deep Dive Topics ](../deep-dive/)Explore advanced concepts like Workflow Rules, Testing, and Observability. # Introduction & Core Concepts > Understand the fundamental concepts and benefits of IdentityFlow workflows. ## [Welcome to IdentityFlow Workflows!](#welcome-to-identityflow-workflows) [Section titled “Welcome to IdentityFlow Workflows!”](#welcome-to-identityflow-workflows) This guide will walk you through creating, configuring, and managing automated processes using the `@identity-flow/sdk`. ### [What are IdentityFlow Workflows?](#what-are-identityflow-workflows) [Section titled “What are IdentityFlow Workflows?”](#what-are-identityflow-workflows) IdentityFlow provides a powerful, code-first engine for defining and executing complex business processes. At its core, it leverages several key principles: * **Event-Sourced**: Every state change, decision, and action within a workflow is recorded as an immutable event. This creates a complete, auditable history of everything that happened. Learn more about [Event Sourcing in our Features section](../../features/10-event-sourcing/). * **Code-First**: Workflows are defined directly in TypeScript using the `@identity-flow/sdk`. This gives you the full power and expressiveness of a familiar programming language to model complex logic. * **Resilient & Durable**: The engine is designed to handle failures gracefully. Thanks to event sourcing, workflows can resume exactly where they left off after interruptions, ensuring exactly-once execution of critical steps. * **Transparent & Observable**: With a full event log and integration with standards like OpenTelemetry, you gain deep visibility into where your workflows are and what they’ve done. * **Deterministic Execution**: Given the same input, workflows always execute the same way. Learn more about [Deterministic Execution in our Features section](../../features/20-deterministic-execution/). ### [Key Benefits](#key-benefits) [Section titled “Key Benefits”](#key-benefits) * **Transparency**: See exactly what happened at each step. * **Durability**: Ensure processes complete reliably, even amidst failures. * **Auditability**: Maintain a complete, immutable log for compliance and debugging. 
* **Flexibility**: Model complex logic using familiar TypeScript code. * **Testability**: Write unit and integration tests for your workflow logic. ### [Core Concepts Overview](#core-concepts-overview) [Section titled “Core Concepts Overview”](#core-concepts-overview) Workflow Definition A template written in TypeScript using the SDK, describing the steps, logic, and rules of a specific process. Workflow Instance A single, live execution of a Workflow Definition, triggered with specific input parameters. Each instance has its own state and event history. Steps / Activities Individual units of work within a workflow (e.g., calling an API, waiting for input, running custom logic). They are atomic and can be retried independently. Events Immutable records representing every state change or action that occurs during a workflow instance’s lifecycle. This underpins event sourcing. State The current status of a workflow instance (e.g., `ACTIVE`, `SLEEPING`, `PENDING`, `COMPLETED`, `FAILED`). ### [Prerequisites](#prerequisites) [Section titled “Prerequisites”](#prerequisites) To follow the guides, you should have: * Familiarity with TypeScript and modern JavaScript (async/await). * Node.js (version recommended by the project) installed. * Access to an IdentityFlow engine environment (including the GraphQL API, typically at `http://localhost:4000/graphql` for local development). Ready to build your first workflow? Let’s start with the [Quick Start Guide](../20-quickstart/)! ### [Detailed Concepts](#detailed-concepts) [Section titled “Detailed Concepts”](#detailed-concepts) #### [Workflows](#workflows) [Section titled “Workflows”](#workflows) A workflow is a series of steps executed in a specific order, defined as an async function. * Basic Workflow Example ```typescript import { defineWorkflow } from '@identity-flow/sdk'; export default defineWorkflow('order-processing', async (flow) => { // Step 1: Validate Order const validatedOrder = await flow.do('validate order', async () => { return validateOrder(flow.params); }); // Step 2: Process Payment const payment = await flow.do('process payment', async () => { return processPayment(validatedOrder); }); // Step 3: Fulfill Order await flow.do('fulfill order', async () => { return fulfillOrder(payment); }); }); ``` * Workflow With Options ```typescript export default defineWorkflow( 'payment-processing', { retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' }, timeout: '5 minutes' }, async (flow) => { await flow.do('process payment', async () => { return processPayment(flow.params); }); }, ); ``` #### [Steps (Activities)](#steps-activities) [Section titled “Steps (Activities)”](#steps-activities) Steps are atomic units of work within a workflow that can be retried independently. Atomic Operations Each step performs a single, cohesive unit of work. Independent Retries Steps can be retried individually on failure. State Management Steps maintain state through their return values. 
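To make the last point concrete, here is a minimal sketch of state flowing between steps through their return values; `lookupCustomer` and `chargeCustomer` are hypothetical helpers standing in for your own logic:

```typescript
import { defineWorkflow } from '@identity-flow/sdk';

export default defineWorkflow('charge-customer', async (flow) => {
  // Step 1: the returned record is persisted as this step's result
  const customer = await flow.do('lookup customer', async () => {
    return lookupCustomer(flow.params.customerId); // Hypothetical helper
  });

  // Step 2: consumes the previous step's return value rather than external state
  const receipt = await flow.do('charge customer', async () => {
    return chargeCustomer(customer.id, flow.params.amount); // Hypothetical helper
  });

  return { receiptId: receipt.id };
});
```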
##### [Step Configuration Example](#step-configuration-example) [Section titled “Step Configuration Example”](#step-configuration-example) ```typescript await flow.do( 'process payment', { // Retry configuration retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' }, // Maximum execution time timeout: '5 minutes', // Unique key for idempotency idempotencyKey: orderId, // Step-level validation schema: PaymentSchema, }, async () => { return processPayment(flow.params); }, ); ``` #### [Event Sourcing Explained](#event-sourcing-explained) [Section titled “Event Sourcing Explained”](#event-sourcing-explained) All state changes are recorded as events, providing transparency and audit trails. * Event Recording Example ```typescript export default defineWorkflow('document-approval', async (flow) => { // Each action creates an event await flow.do('submit document', async () => { await submitDocument(flow.params); // Event: DocumentSubmitted }); await flow.do('request approval', async () => { await requestApproval(flow.params); // Event: ApprovalRequested }); }); ``` * Event Replay Example ```typescript // Events can be replayed to reconstruct state const events = await getWorkflowEvents(workflowId); const state = events.reduce((state, event) => { switch (event.type) { case 'DocumentSubmitted': return { ...state, submitted: true }; case 'ApprovalRequested': return { ...state, approvalPending: true }; default: return state; } }, {}); ``` For more details, see the [Event Sourcing feature page](../../features/10-event-sourcing/). #### [Deterministic Execution Explained](#deterministic-execution-explained) [Section titled “Deterministic Execution Explained”](#deterministic-execution-explained) Given the same input, workflows always execute the same way. This ensures predictable outcomes and reliable replay for debugging. * Deterministic Example ```typescript export default defineWorkflow('order-processing', async (flow) => { // Given the same input, this will always execute the same way const validationResult = await flow.do('validate order', async () => { return await validateOrder(flow.params); }); // Conditional paths are predictable based on input if (validationResult.approved) { await flow.do('process order', async () => { await processOrder(flow.params); }); } else { await flow.do('reject order', async () => { await rejectOrder(flow.params, validationResult.reason); }); } }); ``` For more details, see the [Deterministic Execution feature page](../../features/20-deterministic-execution/). *** ### [Next Steps](#next-steps) [Section titled “Next Steps”](#next-steps) Continue your learning journey: [Quick Start Guide ](../20-quickstart/)Get hands-on with IdentityFlow by building your first workflow. [Defining Workflows ](../30-defining-workflows/)Dive deeper into the \`defineWorkflow\` function and its configurations. [Workflow Rules ](../../deep-dive/05-workflow-rules/)Master essential guidelines for building reliable and maintainable workflows. # Quick Start Guide > Get up and running with IdentityFlow in minutes Installation Install IdentityFlow using your preferred package manager. First Workflow Create your first workflow with step-by-step guidance. Best Practices Learn essential patterns for reliable workflows. 
## [Installation](#installation) [Section titled “Installation”](#installation) Install IdentityFlow using your preferred package manager: * npm ```sh npm i @identity-flow/sdk ``` * pnpm ```sh pnpm add @identity-flow/sdk ``` * yarn ```sh yarn add @identity-flow/sdk ``` ## [Your First Workflow](#your-first-workflow) [Section titled “Your First Workflow”](#your-first-workflow) 1. Create a new workflow file: ```typescript import * as v from '@identity-flow/sdk/valibot'; import { defineWorkflow } from '@identity-flow/sdk'; // Define input schema const OrderSchema = v.object({ orderId: v.string(), items: v.array(v.object({ productId: v.string(), quantity: v.number([v.positive()]) })), }); ``` 2. Define the workflow with validation: ```typescript export default defineWorkflow( 'order-processing', { // Validate input against schema schema: OrderSchema, // Configure default retry policy retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' }, }, async (flow) => { // Workflow implementation }, ); ``` 3. Add workflow steps: ```typescript // Validate order const validatedOrder = await flow.do('validate order', async () => { return validateOrder(flow.params); }); // Process payment and check inventory in parallel const [payment, inventory] = await Promise.all([ flow.do('process payment', async () => { return processPayment(validatedOrder); }), flow.do('check inventory', async () => { return checkInventory(validatedOrder.items); }), ]); // Fulfill order await flow.do('fulfill order', async () => { return fulfillOrder({ order: validatedOrder, payment, inventory }); }); ``` 4. Add error handling: ```typescript try { await flow.do('process payment', async () => { return processPayment(validatedOrder); }); } catch (error) { if (error.code === 'INSUFFICIENT_FUNDS') { throw new NonRetryableError(error); } throw error; } ``` 5. 
Return workflow result: ```typescript return { orderId: validatedOrder.orderId, status: 'COMPLETED', payment: payment.id }; ``` ## [Complete Example](#complete-example) [Section titled “Complete Example”](#complete-example) Here’s a complete workflow that processes orders: ```typescript import * as v from '@identity-flow/sdk/valibot'; import { NonRetryableError, defineWorkflow } from '@identity-flow/sdk'; // Input schema const OrderSchema = v.object({ orderId: v.string(), items: v.array(v.object({ productId: v.string(), quantity: v.number([v.positive()]) })), }); // Result type interface OrderResult { orderId: string; status: 'COMPLETED' | 'failed'; payment?: string; } export default defineWorkflow( 'order-processing', { schema: OrderSchema, retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' } }, async (flow) => { try { // Validate order const validatedOrder = await flow.do('validate order', async () => { return validateOrder(flow.params); }); // Process payment and check inventory in parallel const [payment, inventory] = await Promise.all([ flow.do('process payment', async () => { try { return await processPayment(validatedOrder); } catch (error) { if (error.code === 'INSUFFICIENT_FUNDS') { throw new NonRetryableError(error); } throw error; } }), flow.do('check inventory', async () => { return checkInventory(validatedOrder.items); }), ]); // Fulfill order await flow.do('fulfill order', async () => { return fulfillOrder({ order: validatedOrder, payment, inventory }); }); // Return success result return { orderId: validatedOrder.orderId, status: 'COMPLETED', payment: payment.id }; } catch (error) { // Return failure result return { orderId: flow.params.orderId, status: 'failed' }; } }, ); ``` ## [Key Concepts](#key-concepts) [Section titled “Key Concepts”](#key-concepts) [Workflow Definition ](../30-defining-workflows/)Use \`defineWorkflow\` to create workflows with input validation and retry policies. [Steps ](../40-workflow-activities/)Break workflows into discrete steps using \`flow\.do()\` for better error handling. [Parallel Execution ](../../deep-dive/50-parallel-processing/)Run independent steps concurrently using \`Promise.all()\` with multiple \`flow\.do()\` calls. [Error Handling ](../../deep-dive/10-error-handling/)Use try/catch and \`NonRetryableError\` for robust error handling. ## [Best Practices](#best-practices) [Section titled “Best Practices”](#best-practices) 1. **Input Validation** * Always validate workflow input using schemas * Define clear input and output types * Handle validation errors appropriately 2. **Error Handling** * Configure retry policies for transient failures * Use `NonRetryableError` for permanent failures * Log errors with context for debugging 3. **Step Design** * Keep steps focused and atomic * Use parallel execution for independent operations * Maintain idempotency for reliability ## [Next Steps](#next-steps) [Section titled “Next Steps”](#next-steps) [Core Concepts ](../10-introduction-concepts/)Learn more about core concepts. [Workflow Rules ](../../deep-dive/05-workflow-rules/)Master workflow rules for reliability. [Error Handling ](../60-error-handling-retries/)Explore error handling strategies. # Defining Workflows > Learn the details of defining workflows, configuration, the execute function, and input validation. 
## [Setting Up Your Development Environment](#setting-up-your-development-environment) [Section titled “Setting Up Your Development Environment”](#setting-up-your-development-environment) Before you can define workflows, you need to set up your project and install the necessary SDK. ### [Project Structure (Recommendation)](#project-structure-recommendation) [Section titled “Project Structure (Recommendation)”](#project-structure-recommendation) While IdentityFlow doesn’t enforce a specific structure for your *source* code if you’re building workflows as part of a larger application, the engine typically discovers workflow definitions from a specific location at runtime. For standard deployments, the recommended structure aligns with the engine’s default configuration: ```plaintext your-project-root/ ├── config/ # Engine configuration typically resides here │ ├── workflows/ # Recommended location for workflow definitions & related files │ │ ├── bindings/ # Bindings specific to these workflows │ │ │ └── graphql.ts │ │ ├── types/ # Optional custom types for these workflows │ │ │ └── custom-types.ts │ │ ├── simple-approval.workflow.ts │ │ ├── user-onboarding.flow.ts │ │ ├── package.json # Optional: If workflows have specific dependencies │ │ ├── tsconfig.json # Optional: If workflows need specific TS config │ │ └── ... │ └── identity-flow.conf # Example engine configuration file └── ... (other project files) ``` * **`config/workflows/`**: Placing your **workflow definition `.ts` files** here allows the engine to discover them automatically with minimal configuration. This is the **recommended location** for simpler deployments. The engine will compile/bundle these TypeScript files internally. * **Related Files**: You can also co-locate related files like `bindings`, custom `types`, and even `package.json` or `tsconfig.json` within this directory if your workflows form a somewhat self-contained unit or have specific dependencies/compilation needs separate from a main application. * **File Naming**: The engine often looks for files ending in `.workflow.ts`, `.flow.ts`, etc. (configurable). Using clear names like `user-onboarding.workflow.ts` is good practice. Placing workflows and their immediate dependencies in `config/workflows/` often simplifies deployment as the engine is configured to look there by default. ### [Workflow Definition Discovery by the Engine](#workflow-definition-discovery-by-the-engine) [Section titled “Workflow Definition Discovery by the Engine”](#workflow-definition-discovery-by-the-engine) * Your workflow definition `.ts` files (like `simple-flow.ts`) need to be placed in a location where the IdentityFlow engine can discover them. * The engine **compiles and bundles these TypeScript files internally** when it first discovers them or when they change. You **do not need a separate build step** for your workflow files. * Internal compilation uses [Sucrase](https://sucrase.io/) for fast TypeScript transformation. * Support is included for modern features like **dynamic imports** (`import('./module')`), **JSON imports** (`import data from './data.json' assert { type: 'json' }`), and **WASM imports** (`import wasm from './module.wasm' assert { type: 'wasm' }`). * The compiled/bundled output is stored internally by the engine (typically in the database) for efficient execution. 
* A common and recommended approach for standard deployments is to place your workflow definition `.ts` files within a `workflows` subdirectory relative to your engine’s configuration file (e.g., `config/workflows/`). * The engine automatically searches for files matching patterns like `**/*.workflow.ts`, `**/*.flow.ts`, etc., within the configured path(s). * **Default Include Patterns:** By default, the engine looks for files matching globs like `**/*.{request,approval,flow,workflow,wf}.?(m)[jt]s` and `**/workflows/*.?(m)[jt]s`. * **Default Exclude Patterns:** Common patterns like `.d.ts` files, test files (`*.test.ts`, `*.spec.ts`), configuration files, `node_modules`, `dist`, `build`, etc., are excluded by default. * You can customize the search paths and include/exclude patterns in the engine’s configuration file. * Refer to the IdentityFlow **Engine Deployment and Configuration Guide** (link TBD) for detailed options. * Ensure `draft: false` is set in your `defineWorkflow` configuration for definitions ready for use. ### [Installing the SDK](#installing-the-sdk) [Section titled “Installing the SDK”](#installing-the-sdk) The core package for defining workflows is `@identity-flow/sdk`. Install it in your project using your preferred package manager: * npm ```sh npm i @identity-flow/sdk ``` * pnpm ```sh pnpm add @identity-flow/sdk ``` * yarn ```sh yarn add @identity-flow/sdk ``` This package provides the `defineWorkflow` function and the `flow` helper object you’ll use extensively. *** ## [The `defineWorkflow` Function](#the-defineworkflow-function) [Section titled “The defineWorkflow Function”](#the-defineworkflow-function) Workflows are defined in TypeScript files using the `defineWorkflow` function exported by `@identity-flow/sdk`. This function takes two arguments: 1. **Configuration Object**: An object specifying metadata and default settings for the workflow. 2. **Execute Function**: An asynchronous function (`async (flow) => { ... }`) containing the actual logic of your workflow. ### [Basic Example (`simple-flow.ts`)](#basic-example-simple-flowts) [Section titled “Basic Example (simple-flow.ts)”](#basic-example-simple-flowts) Let’s look at a minimal workflow definition: src/workflows/simple-flow\.ts ```typescript import { defineWorkflow } from '@identity-flow/sdk'; export default defineWorkflow( { // Core identification name: '@carv/simple-flow', // Unique identifier for the workflow definition version: '1.0.0', // Semantic version for this definition // Optional metadata (useful for UI and organization) label: 'Simple Flow', // Human-readable name description: 'A basic example workflow', // Deployment setting draft: false, // Set to false to allow engine deployment }, // The main logic of the workflow async (flow) => { flow.log('Workflow instance started!'); // Your workflow steps will go here... const result = 'Hello from workflow!'; flow.log('Workflow instance finishing.'); return result; // The final result of the workflow instance }, ); ``` **Key Configuration Properties:** * `name`: (Required) A unique string identifying this workflow definition (e.g., `@your-org/process-name`). * `version`: (Required) A version string (preferably SemVer like `1.0.0`) to track changes. * `label`: (Optional) A user-friendly name displayed in UIs. * `description`: (Optional) A brief explanation of the workflow’s purpose. * `draft`: (Optional, defaults to `true`) Set to `false` to indicate the workflow is ready for deployment/use by the engine. 
Draft workflows are typically ignored by deployment processes. * `releaseChannel`: (Optional, defaults to `'latest'`) Used to tag specific versions (e.g., `'beta'`, `'stable'`). * `schema`: (Optional) A validation schema (e.g., from Valibot) for the input `params`. See the Validation section later. * `defaults`: (Optional) Default options (like retry settings) for all activities in this workflow. * `meta`: (Optional) Arbitrary JSON-serializable metadata to attach to the workflow definition. Read via `flow.definition.meta`. This definition exports a configuration object that the IdentityFlow engine can discover and register. The `execute` function is where the core logic resides, which we’ll explore next. *** ## [Writing Workflow Logic: The `execute` Function](#writing-workflow-logic-the-execute-function) [Section titled “Writing Workflow Logic: The execute Function”](#writing-workflow-logic-the-execute-function) The second argument to `defineWorkflow` is the `execute` function. This is where you implement the steps and logic of your process. ```typescript async (flow) => { // Your workflow logic goes here }; ``` ### [The `flow` Helper Object](#the-flow-helper-object) [Section titled “The flow Helper Object”](#the-flow-helper-object) The `execute` function receives a single argument, typically named `flow`, which is an object of type `Flow` (from `@identity-flow/sdk`). This object is your primary interface for interacting with the workflow engine and orchestrating activities. Key properties and methods on the `flow` object include: * **`flow.params`**: Accesses the input parameters passed when the workflow instance was started. If a `schema` was provided in the definition config, `params` will be the validated and typed output. ```typescript async (flow) => { const { userId, orderId } = flow.params; // Assuming schema validated these flow.log(`Processing order ${orderId} for user ${userId}`); }; ``` * **`flow.vars`**: A mutable object (`Record`) used primarily to **expose internal state externally** (e.g., for UI display via the GraphQL API). This state is persisted when the workflow waits (e.g., during `sleep`, `dialog`, `request`), allowing external systems to query the current status or progress. ```typescript async (flow) => { // Example: Expose progress for UI flow.vars.progress = { currentStep: 'Processing Payment', percentComplete: 50 }; // ... workflow pauses ... // Update exposed progress flow.vars.progress = { currentStep: 'Order Shipped', percentComplete: 100 }; }; ``` **Important Distinction**: While `flow.vars` *is* persisted, use regular function scope variables (`const`, `let`) for managing internal processing state between `await` points within a single execution run *before* the workflow pauses. `flow.vars` is best reserved for data you need to make visible *outside* the workflow execution itself. For robust state management across persisted steps, always rely on passing data through step inputs and outputs. > Learn more about managing state effectively in [Rule 3: Persist State Through Steps](../../deep-dive/05-workflow-rules/#3-persist-state-through-steps). * **`flow.instance`**: Provides read-only access to information about the current workflow instance (e.g., `flow.instance.id`, `flow.instance.createdAt`, `flow.instance.meta`). * **`flow.definition`**: Provides read-only access to the definition configuration this instance is running (e.g., `flow.definition.name`, `flow.definition.version`). 
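As a small sketch, both can be surfaced in logs to make it easier to correlate a run with the definition that produced it:

```typescript
async (flow) => {
  // Read-only metadata about this run and the definition it executes
  flow.log('Instance:', flow.instance.id, 'created at:', flow.instance.createdAt);
  flow.log('Definition:', flow.definition.name, 'version:', flow.definition.version);
};
```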
* **Logging Methods (`flow.log`, `flow.info`, `flow.warn`, `flow.error`, `flow.debug`, `flow.trace`)**: Standard methods for logging messages during workflow execution. These logs are captured by the engine and associated with the workflow instance and specific step, aiding debugging. * **Activity Methods (`flow.do`, `flow.sleep`, `flow.dialog`, `flow.request`, `flow.start`)**: These are the core methods for defining the actual steps and pauses in your workflow. They are covered in detail in the [Workflow Activities](../40-workflow-activities/) guide. * **`flow.use()`**: A method for lazy-loading and injecting dependencies (like API clients or services). Covered in the [Dependencies & Logging](../50-dependencies-logging/) guide. * **`flow.assert()`**: Utility to assert a condition is truthy, throwing a retryable error if not. Details in [Error Handling & Retries](../60-error-handling-retries/). ### [Return Value](#return-value) [Section titled “Return Value”](#return-value) The value returned by the `execute` function becomes the final result of the workflow instance when it completes successfully. This result is recorded in the instance’s event history and can be retrieved via the GraphQL API. ```typescript async (flow) => { // ... logic ... const finalReport = { processedItems: 10, status: 'Completed' }; return finalReport; // This object is the workflow's result }; ``` *** ## [Input Validation with Schemas](#input-validation-with-schemas) [Section titled “Input Validation with Schemas”](#input-validation-with-schemas) Ensuring data integrity is crucial for robust workflows. IdentityFlow integrates seamlessly with validation libraries that adhere to the **Standard Schema specification (`StandardSchemaV1`)**, allowing you to define and enforce schemas for workflow parameters and activity results. [What is Standard Schema?](https://standardschema.dev/) **Built-in Support (Valibot):** IdentityFlow includes **Valibot** out-of-the-box. You **do not need to install it separately**. Access its functions conveniently via the SDK export: ```typescript import * as v from '@identity-flow/sdk/valibot'; ``` **Using Other Libraries (Standard Schema):** You can use *any* library compatible with `StandardSchemaV1` (like Zod, ArkType, Effect Schema, etc.). Simply import the library as usual and pass its schema objects where needed. **Schema Locations:** You can apply schemas in several places: 1. **Workflow Definition (`defineWorkflow`)**: Validates the initial `params` passed when starting a workflow instance. ```typescript import * as v from '@identity-flow/sdk/valibot'; // Example using Zod (assuming zod is installed in the project) // import * as z from 'zod'; // const zodSchema = z.object({ /* ... */ }); export default defineWorkflow( { name: 'process-order', version: '1.0.0', // Pass any StandardSchemaV1 compatible schema: schema: v.object({ // Validate flow.params using built-in Valibot orderId: v.string([v.uuid()]), customerId: v.string(), amount: v.number([v.minValue(0)]), items: v.array(v.object({ sku: v.string(), quantity: v.number() })), }), }, async (flow) => { // flow.params is guaranteed to match the schema here const { orderId, amount } = flow.params; // ... }, ); ``` If the input parameters provided when starting the workflow don’t match the schema, the instance creation will fail immediately. 2. 
**Activity Options (`flow.do`, `flow.dialog`, `flow.request`, `flow.start`)**: Validates the *result* returned by the activity or the data submitted externally to continue a `dialog` or `request`. ```typescript // Inside execute function: // Validate the result of a flow.do task const userInfo = await flow.do( 'fetch-user-profile', { schema: v.object({ name: v.string(), email: v.string([v.email()]) }) }, async ({ use }) => { const apiClient = use(ApiClientBinding); // Assuming ApiClientBinding is defined return await apiClient.getUserProfile(flow.params.userId); }, ); // userInfo is guaranteed to match the schema here // Validate the data submitted to continue a dialog const approval = await flow.dialog( 'manager-approval', { schema: v.object({ approved: v.boolean(), comments: v.optional(v.string()) }) }, ({ token }) => ({ params: { /* ... */ }, }), ); // approval is guaranteed to match the schema here ``` If the data returned by the activity or submitted via `continue` doesn’t match the schema provided in the options, the step will fail and potentially retry. **Benefits:** * **Type Safety**: Ensures data conforms to expected structures throughout the workflow. * **Early Error Detection**: Catches invalid data close to the source. * **Clear Contracts**: Defines clear expectations for inputs and outputs of workflows and steps. Leverage schema validation using your preferred Standard Schema library to build more reliable and predictable workflows. *** ## [Next Steps](#next-steps) [Section titled “Next Steps”](#next-steps) [Workflow Activities ](../40-workflow-activities/)Learn how to use \`flow\.do\`, \`flow\.sleep\`, \`flow\.dialog\`, \`flow\.request\`, and \`flow\.start\`. [Error Handling & Retries ](../60-error-handling-retries/)Discover strategies for managing failures and configuring retries. [Dependencies & Logging ](../50-dependencies-logging/)Understand how to inject dependencies and use built-in logging. # Core Workflow Activities > Learn about flow.do, flow.sleep, flow.dialog, flow.request, and flow.start for building workflow steps. ## [Core Workflow Activities](#core-workflow-activities) [Section titled “Core Workflow Activities”](#core-workflow-activities) The `flow` object provides several asynchronous methods (`do`, `sleep`, `dialog`, `request`, `start`) to define the steps, pauses, and interactions of your workflow. The engine ensures these activities are executed durably and can be resumed. > Tip: Aim to make each activity step as granular as possible. See [Rule 2: Make Steps Granular](../../deep-dive/05-workflow-rules/#2-make-steps-granular). ### [Performing Tasks: `flow.do()`](#performing-tasks-flowdo) [Section titled “Performing Tasks: flow.do()”](#performing-tasks-flowdo) The `flow.do()` method is used to execute a unit of work, which can be synchronous or asynchronous (returning a Promise). It’s the most common activity for running your custom logic, calculations, or simple external calls. **Signature:** ```typescript flow.do(name: string, task: Task): Promise; flow.do(name: string, options: TaskOptions, task: Task): Promise; flow.do(name: string, options: WithSchema, task: Task): Promise; ``` * `name`: (Required) A unique string identifier for this step within the workflow definition. This name is crucial for idempotency. * `task`: (Required) An asynchronous function `async (ctx) => { ... }` that contains the logic for this step. 
It receives a `TaskContext` object (`ctx`) with properties like: * `ctx.step`: Information about the current step execution (e.g., `ctx.step.attempts`). * `ctx.use`: The dependency injection function (same as `flow.use`). * `ctx.signal`: An `AbortSignal` for cancellation handling. * `ctx.span`: The OpenTelemetry span for this task. * `options`: (Optional) An object to configure behavior like retries (`TaskOptions`) or add schema validation (`WithSchema`). * `Result`: The type of the value returned by the `task` function. **Idempotency and Caching:** The `name` provided to `flow.do()` is key. The workflow engine records the result of a successfully completed `flow.do()` step associated with its unique `name`. If the workflow restarts or replays due to an interruption, the engine will *not* re-execute the `task` function for a step with the same `name` that has already completed successfully. Instead, it will immediately return the previously recorded result. **This makes `flow.do()` steps inherently idempotent.** Ensure your step names are unique and descriptive within the workflow. > For a deeper dive into idempotency, see [Rule 1: Ensure Idempotency](../../deep-dive/05-workflow-rules/#1-ensure-idempotency) in our Workflow Rules guide. **Example:** ```typescript // Inside the execute function: const result = await flow.do('First step', async ({ step }) => { flow.log(`Executing First step, attempt: ${step.attempts}`); // Example: Simulate work that might fail initially if (step.attempts < 3) { flow.warn('Simulating failure on attempt', step.attempts); throw new Error(`Not ready yet: ${step.attempts}`); // Throwing an error triggers a retry (configurable) } // Simulate successful work const output = { message: 'First step completed successfully!', data: 123 }; flow.log('First step succeeded.'); return output; // This result is cached upon success }); // If the workflow resumes after this step, the above task function // will not re-run. `result` will contain the cached output. flow.log('Result from First step:', result); const result2 = await flow.do('Second step', () => { // Simple synchronous task flow.log('Executing Second step.'); return { output: 'Second step result' }; }); flow.log('Result from Second step:', result2); ``` **Retries:** If the `task` function throws an error, the engine will attempt to retry the step based on the configured retry strategy (either in the step’s `options` or the workflow `defaults`). See the [Error Handling & Retries](../60-error-handling-retries/) guide for more on configuring retries. ### [Introducing Delays: `flow.sleep()`](#introducing-delays-flowsleep) [Section titled “Introducing Delays: flow.sleep()”](#introducing-delays-flowsleep) Use `flow.sleep()` to pause the workflow execution for a specific duration or until a specific time. **Signature:** ```typescript flow.sleep(name: string, until: Duration | Date | SleepOptions): Promise; flow.sleep(name: string, until: Duration | Date | SleepOptions, value: T): Promise; ``` * `name`: (Required) A unique string identifier for this sleep step. * `until`: (Required) Specifies when the workflow should wake up. Can be: * A number (milliseconds duration). * A `Date` object (wake up at this specific time). * A duration string (e.g., `'5 seconds'`, `'1 minute'`, `'2 hours'`). * A `SleepOptions` object `{ until: ..., message?: ... }`. * `value`: (Optional) A value to return when the sleep duration completes. 
**Behavior:** When `flow.sleep()` is called, the workflow instance enters the `SLEEPING` state (while the overall instance remains `ACTIVE`). The engine schedules it to wake up at the specified time. Once woken, the workflow resumes execution from the point immediately after the `await flow.sleep(...)` call. **Use Cases:** * Implementing scheduled tasks. * Waiting for external processes that take a known amount of time. * Rate limiting or adding delays between steps. **Example:** ```typescript // Inside the execute function: flow.log('About to sleep...'); // Wait for 5 seconds await flow.sleep('Wait between steps', '5 seconds'); // Or wait until a specific date: // await flow.sleep('Wait until specific time', new Date('2024-12-31T23:59:59Z')); flow.log('Woke up after sleeping!'); // You can also return a value after sleeping const sleepResult = await flow.sleep('Short nap', 100, 'Slept for 100ms'); flow.log(sleepResult); // Outputs: Slept for 100ms ``` ### [Waiting for User Input: `flow.dialog()`](#waiting-for-user-input-flowdialog) [Section titled “Waiting for User Input: flow.dialog()”](#waiting-for-user-input-flowdialog) Use `flow.dialog()` when your workflow needs to pause and wait for input or confirmation from a human user (or an external system acting like one). **Signature:** ```typescript flow.dialog(name: string, define: DialogActivity): Promise; flow.dialog(name: string, options: DialogOptions, define: DialogActivity): Promise; flow.dialog(name: string, options: WithSchema, define: DialogActivity): Promise; ``` * `name`: (Required) A unique string identifier for this dialog step. * `define`: (Required) An asynchronous function `async (ctx) => { ... }` that defines the parameters for the dialog interaction. It receives an `ActivityContext` (`ctx`) and must return an object (`DialogInput`) containing: * `params`: Data needed by the UI or external system to present the dialog (e.g., form identifier, context data, user/role info). * `assignees`: (Optional) An array of subject strings (user/group IDs) who are eligible to respond to the dialog. * `until`: (Optional) A `Duration` or `Date` by which the dialog must be resolved, otherwise the step errors. * `message`: (Optional) A descriptive message for this step. * `options`: (Optional) An object to configure retries (`DialogOptions`) or add schema validation (`WithSchema`) *for the result* returned when the dialog is completed. * `Result`: The type of the data expected back when the dialog is resolved. * `Params`: The type of the `params` object returned by the `define` function. **Behavior:** 1. The `define` function is executed to generate the dialog parameters (`DialogInput`). 2. The workflow instance step enters the `PENDING` state (instance remains `ACTIVE`). 3. The engine records the `params`, `assignees`, `until`, and a unique `token` associated with this step. 4. The workflow pauses indefinitely (or until the `until` time is reached). 5. An external system (like a UI or notification service) must query the workflow instance state (via the GraphQL API), retrieve the `PENDING` step’s details (including `params` and `token`), present the necessary interface to the user, and collect their response. 6. Once the user responds, the external system calls the `workflowActivity(token: ...).continue(input: ...)` GraphQL mutation, providing the `token` and the user’s response data (`input.data`). 7. 
The engine receives the continuation request, finds the paused instance and step via the `token`, validates the submitted `data` against the `schema` (if provided in `options`), wakes up the workflow, and the `await flow.dialog(...)` call resolves, returning the validated `data`. **Example:** ```typescript // Inside the execute function import * as v from '@identity-flow/sdk/valibot'; flow.log('Waiting for form approval...'); const approvalResult = await flow.dialog( 'approve form', // Options specify the expected result schema { schema: v.object({ approved: v.boolean(), reason: v.nullish(v.string()) }) }, // Define function returns params for the UI and sets a timeout ({ token }) => ({ params: { form: 'one-form-approval', // Identifier for the UI component token // Pass the token needed to continue }, until: '30 minutes' // Step fails if not resolved in 30 mins }) ); // Execution resumes here after GraphQL workflowActivity(...).continue() is called flow.log('Approval result received:', approvalResult); if (approvalResult.approved) { flow.log('Form was approved.', approvalResult.reason ? `Reason: ${approvalResult.reason}` : ''); } else { flow.log('Form was rejected.', approvalResult.reason ? `Reason: ${approvalResult.reason}` : ''); // Potentially end the workflow early return { status: 'rejected' }; } ``` `flow.dialog` is essential for incorporating human decision points into your automated processes. ### [Requesting External Data: `flow.request()`](#requesting-external-data-flowrequest) [Section titled “Requesting External Data: flow.request()”](#requesting-external-data-flowrequest) Use `flow.request()` when your workflow needs to pause and wait for external data or status updates. **Signature:** ```typescript flow.request(name: string, options: RequestOptions, activity: Activity): Promise; ``` * `name`: (Required) A unique string identifier for this request step. * `options`: (Required) An object to configure the request behavior, including schema validation and polling configuration. * `activity`: (Required) An asynchronous function `async (ctx) => { ... }` that contains the logic for this step. It receives a `RequestContext` (`ctx`) with properties like: * `ctx.step`: Information about the current step execution (e.g., `ctx.step.attempts`). * `ctx.use`: The dependency injection function (same as `flow.use`). * `ctx.signal`: An `AbortSignal` for cancellation handling. * `ctx.span`: The OpenTelemetry span for this task. * `ctx.token`: A unique token for this request activity, used for external continuation. * `Result`: The type of data expected back when the request is completed (either via external continuation or successful polling). * `Data`: The type of data returned by the initial `activity` function (often used for polling). **Behavior:** * **External Continuation (Default):** 1. The `activity` function executes, initiating the external process and potentially passing the unique `ctx.token` to the external system. 2. The workflow instance step enters the `REQUESTING` state (instance remains `ACTIVE`) and hibernates. 3. The external system performs its work asynchronously. 4. When complete, the external system must call back to the IdentityFlow engine, typically via the `workflowActivity(token: ...).continue(input: ...)` GraphQL mutation, providing the `token` and the result `data`. 5. The engine validates the result against the `schema` (if provided), wakes the workflow, and the `await flow.request(...)` call resolves with the result. * **Polling:** 1. 
The `activity` function executes, initiates the external process, and returns data (`Data`) needed for polling (e.g., a job ID). 2. The workflow instance step enters the `REQUESTING` state. 3. The engine periodically executes the provided `poll.callback` function according to the `poll.retries` configuration. 4. The `poll.callback` function checks the status of the external operation (using the data returned by the initial `activity`). It should return `{ success: true, data: Result }` when the operation is complete, or `{ success: false }` to continue polling. 5. If the `poll.callback` returns success, the `await flow.request(...)` call resolves with the `Result` data. 6. If polling attempts are exhausted without success, the step fails. **Use Cases:** * Interacting with asynchronous APIs that use webhooks/callbacks. * Starting long-running batch jobs and waiting for their completion. * Integrating with systems where status needs to be checked periodically. **Example (Conceptual - Callback):** ```typescript // Inside the execute function: import * as v from '@identity-flow/sdk/valibot'; flow.log('Initiating external report generation...'); const reportResult = await flow.request( 'generate-report', // Define expected result structure { schema: v.object({ reportUrl: v.string(), status: v.literal('completed') }) }, // Activity function initiates the request and passes the token async ({ token, use }) => { const reportService = use(ReportServiceBinding); // Assuming ReportServiceBinding is defined // Tell the external service to start generation and notify via callback // including the unique token. await reportService.startReportGeneration({ userId: flow.params.userId, callbackToken: token, // External service uses this token to continue callbackUrl: 'https://identity-flow.example.com/api/callback', // Example callback endpoint }); flow.log('Report generation initiated, waiting for callback...'); // No return value needed here as we wait for external continuation }, ); // Execution resumes here after the external service calls back via API flow.log(`Report generated successfully: ${reportResult.reportUrl}`); ``` **Example (Conceptual - Polling):** ```typescript // Inside the execute function: import * as v from '@identity-flow/sdk/valibot'; flow.log('Starting batch job...'); const jobResult = await flow.request( 'process-batch-job', { // Expected result when polling succeeds schema: v.object({ finalStatus: v.string(), outputLocation: v.string() }), // Polling configuration poll: { // Check every 5 minutes, retry up to 10 times retries: { limit: 10, delay: '5 minutes' }, // Polling callback function callback: async ({ use, step }) => { const jobService = use(JobServiceBinding); // Assuming JobServiceBinding defined const jobId = step.data?.jobId; // Access data returned by initial activity if (!jobId) throw new Error('Missing Job ID for polling'); const status = await jobService.getJobStatus(jobId); flow.debug(`Polling job ${jobId}: Status is ${status.state}`); if (status.state === 'COMPLETED') { return { success: true, data: { finalStatus: status.state, outputLocation: status.resultsUrl }, }; } else if (status.state === 'FAILED') { throw new Error(`Job ${jobId} failed externally: ${status.errorMessage}`); } // Otherwise, continue polling return { success: false }; }, }, }, // Initial activity starts the job and returns data needed for polling async ({ use }) => { const jobService = use(JobServiceBinding); // Assuming JobServiceBinding defined const jobInfo = await jobService.startBatchJob({ 
inputData: flow.params.batchData }); flow.log(`Batch job started with ID: ${jobInfo.jobId}`); // Return data needed by the polling callback return { jobId: jobInfo.jobId }; }, ); // Execution resumes here after polling succeeds flow.log(`Batch job ${jobResult.finalStatus}, output: ${jobResult.outputLocation}`); ``` `flow.request()` handles scenarios where the workflow needs to pause and wait for external triggers or periodic checks. ### [Starting Sub-Workflows: `flow.start()`](#starting-sub-workflows-flowstart) [Section titled “Starting Sub-Workflows: flow.start()”](#starting-sub-workflows-flowstart) For complex processes, you can break down logic into smaller, reusable workflows and orchestrate them using `flow.start()`. This allows you to start another workflow definition (a “sub-workflow” or “child workflow”) and wait for its completion. **Signature:** ```typescript flow.start(name: string, activity: StartWorkflowActivity): Promise; flow.start(name: string, options: ActivityOptions, activity: StartWorkflowActivity): Promise; flow.start(name: string, options: WithSchema, activity: StartWorkflowActivity): Promise; ``` * `name`: (Required) A unique string identifier for this sub-workflow step. * `activity`: (Required) An asynchronous function `async (ctx) => { ... }` that defines which sub-workflow to start and with which parameters. It receives a `StartWorkflowContext` (`ctx`) and must return an object (`StartWorkflowInput`) containing: * `workflow`: The name (or ID) of the workflow definition to start. * `params`: The input parameters to pass to the sub-workflow. * `version`: (Optional) A specific version or range of the sub-workflow definition to use. * `releaseChannel`: (Optional) A specific release channel (e.g., `'latest'`) of the sub-workflow definition. * `label`, `description`, `recipients`, `until`, `meta`: (Optional) Properties to set on the sub-workflow instance. * `options`: (Optional) An object to configure retries (`ActivityOptions`) for the *initiation* of the sub-workflow or add schema validation (`WithSchema`) for the `Result` expected *back* from the sub-workflow upon its completion. * `Result`: The type of the final result returned by the sub-workflow. * `Params`: The type of the `params` passed to the sub-workflow. **Behavior:** 1. The `activity` function executes to determine the sub-workflow definition and its input parameters. 2. The engine initiates a new instance of the specified sub-workflow definition. 3. The current workflow instance step enters the `RELYING` state (instance remains `ACTIVE`) and pauses. 4. The sub-workflow instance executes independently. 5. When the sub-workflow instance completes successfully, its final result is recorded. 6. The engine wakes up the original (parent) workflow. 7. The `await flow.start(...)` call resolves, returning the result from the completed sub-workflow. 8. If the sub-workflow fails or is terminated, the `flow.start()` step in the parent workflow will also typically fail (subject to retry configurations on the `flow.start` options). **Use Cases:** * Reusing common logic (e.g., a standard notification workflow, an approval sub-process). * Breaking down very large or complex workflows into manageable, independent parts. * Implementing patterns like dynamic parallel execution (by starting multiple sub-workflows). 
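Before the full example below, here is a brief sketch of the dynamic parallel pattern mentioned in the last use case; the `@my-org/import-file` sub-workflow, its params, and `flow.params.files` are hypothetical:

```typescript
// Inside the execute function: fan out one sub-workflow per input file and wait for all.
const importResults = await Promise.all(
  flow.params.files.map((file, index) =>
    // Each step needs a unique name, hence the index suffix
    flow.start(`import file ${index}`, () => ({
      workflow: '@my-org/import-file', // Hypothetical sub-workflow definition
      params: { fileUrl: file.url },
      label: `Import ${file.name}`,
    })),
  ),
);
flow.log('All imports completed:', importResults.length);
```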
**Example (Conceptual):** src/workflows/main-process.ts ```typescript import * as v from '@identity-flow/sdk/valibot'; import { defineWorkflow } from '@identity-flow/sdk'; export default defineWorkflow({ name: '@my-org/main-process', version: '1.0.0' }, async (flow) => { flow.log('Main process starting.'); const userData = await flow.do('fetch-user-data', async () => ({ name: 'Alice', id: 'user123' })); flow.log('Starting notification sub-workflow...'); const notificationResult = await flow.start( 'send-welcome-email', { schema: v.object({ delivered: v.boolean(), messageId: v.string() }) }, (ctx) => ({ workflow: '@my-org/send-notification', params: { recipientId: userData.id, subject: 'Welcome!', body: `Hi ${userData.name}, welcome aboard!`, }, label: `Welcome Email for ${userData.name}`, }), ); if (notificationResult.delivered) { flow.log(`Welcome email sent successfully, Message ID: ${notificationResult.messageId}`); } else { flow.warn('Welcome email failed to deliver.'); } flow.log('Main process finished.'); return { status: 'complete' }; }); ``` ```typescript // src/workflows/send-notification.ts (Sub-workflow example) import * as v from '@identity-flow/sdk/valibot'; import { defineWorkflow } from '@identity-flow/sdk'; // import { EmailBinding } from '../bindings/email'; // Assuming EmailBinding is defined export default defineWorkflow( { name: '@my-org/send-notification', version: '1.0.0', schema: v.object({ recipientId: v.string(), subject: v.string(), body: v.string() }), }, async (flow) => { const { recipientId, subject, body } = flow.params; flow.log(`Attempting to send notification to ${recipientId}`); const sendResult = await flow.do('send-via-email-service', async ({ use }) => { // const emailService = use(EmailBinding); // const messageId = await emailService.send(recipientId, subject, body); // return { delivered: true, messageId }; // For example purposes: flow.log('Simulating email send...'); return { delivered: true, messageId: 'fake-message-id-' + Math.random().toString(36).substring(7), }; }); flow.log('Notification sub-workflow finished.'); return sendResult; }, ); ``` Using `flow.start()` promotes modularity and reuse in complex workflow scenarios. *** ## [Next Steps](#next-steps) [Section titled “Next Steps”](#next-steps) Master the other core components of workflow development: [Dependencies & Logging ](../50-dependencies-logging/)Learn how to inject dependencies using flow.use and utilize logging. [Error Handling & Retries ](../60-error-handling-retries/)Configure retry policies and handle errors gracefully. [State, Lifecycle & Deployment ](../70-state-lifecycle-deployment/)Understand workflow states and deployment considerations. # Dependencies & Logging > Manage external dependencies with flow.use() and utilize built-in logging methods in your workflows. ## [Managing Dependencies: `flow.use()`](#managing-dependencies-flowuse) [Section titled “Managing Dependencies: flow.use()”](#managing-dependencies-flowuse) Workflows often need to interact with external services, like databases, APIs, or notification systems. `flow.use()` provides a mechanism for **lazy-loaded dependency injection**. **Signature:** ```typescript flow.use<T>(binding: Binding<T>): T; type Binding<T> = () => T; ``` * `binding`: (Required) A function that takes no arguments and returns an instance of the dependency (e.g., an API client, a database connection pool). This function is called the *binding function*. * `T`: The type of the dependency instance returned by the binding function. **Behavior:** 1.
The first time `flow.use()` is called with a specific `binding` function *reference* within a workflow instance, the `binding` function is executed, and its return value (the dependency instance) is cached. 2. Subsequent calls to `flow.use()` with the *exact same* `binding` function reference within the same workflow instance will return the cached instance directly, without re-executing the binding function. **Why Use `flow.use()`?** * **Lazy Loading**: Dependencies are only initialized if and when they are actually needed by a workflow step, saving resources. * **Caching/Singleton**: Ensures that only one instance of a dependency (per binding function) is created per workflow instance, preventing issues with multiple connections or inconsistent client states. * **Testability**: Makes it easier to mock dependencies during testing by providing mock binding functions. * **Determinism**: Helps maintain workflow determinism by ensuring dependencies are initialized consistently. **Important: Stable Function References** The caching mechanism relies on the *reference* of the `binding` function passed to `flow.use()`. To ensure caching works correctly: * **DO** define your binding functions as standalone, named constants or functions, typically in a separate file (like `src/bindings/graphql.ts`). * **DO NOT** use inline anonymous functions (`flow.use(() => new ApiClient())`) inside your workflow logic, as this creates a new function reference on every call, defeating the cache. **Example:** src/bindings/graphql.ts ```typescript import { defineBinding } from '@identity-flow/sdk'; import { GraphQLClient } from 'graphql-request'; // Optional helper const createClient = () => { console.log('Initializing GraphQL Client...'); // This will only log once per workflow instance return new GraphQLClient('http://localhost:4000/graphql'); }; // Define the binding function using a stable const reference export const GraphqlBinding = defineBinding(createClient); ``` src/workflows/my-workflow\.ts ```typescript import { defineWorkflow } from '@identity-flow/sdk'; import { GraphqlBinding } from '../bindings/graphql'; // Import the stable binding export default defineWorkflow({ name: '@my-org/my-workflow', version: '1.0.0' }, async (flow) => { // First use: initializes the client via GraphqlBinding() const client1 = flow.use(GraphqlBinding); flow.log('Used client 1'); await flow.do('some-task', async ({ use }) => { // Can also use `use` from the task context // Second use (same instance): returns the cached client const client2 = use(GraphqlBinding); flow.log('Used client 2'); // client1 === client2 will be true const data = await client2.request('{ /* some query */ }'); return data; }); }); ``` The `defineBinding` helper is optional but can provide better type inference in some scenarios. Use `flow.use()` consistently to manage external service clients and resources within your workflows. *** ## [Logging Methods](#logging-methods) [Section titled “Logging Methods”](#logging-methods) The `flow` object provides standard methods for logging messages during workflow execution. These logs are captured by the engine and associated with the workflow instance and specific step, aiding debugging. Available methods: * `flow.log(...data: any[])` * `flow.info(...data: any[])` * `flow.warn(...data: any[])` * `flow.error(...data: any[])` * `flow.debug(...data: any[])` * `flow.trace(...data: any[])` **Example:** ```typescript async (flow) => { flow.info('Starting user data fetch...'); try { // ... 
fetch data flow.debug('User data fetched successfully', { userId: flow.params.userId }); } catch (error) { flow.error('Failed to fetch user data', error); } }; ``` *** ## [Next Steps](#next-steps) [Section titled “Next Steps”](#next-steps) With dependencies and logging covered, explore how to handle failures and ensure your workflows are robust: [Error Handling & Retries ](../60-error-handling-retries/)Learn to manage errors and configure retry policies. [State, Lifecycle & Deployment ](../70-state-lifecycle-deployment/)Understand workflow states, their lifecycle, and deployment. # Error Handling & Retries > Learn to manage errors, configure retry policies, and use assertions in your workflows. ## [Customizing Retries](#customizing-retries) [Section titled “Customizing Retries”](#customizing-retries) By default, failed steps (`flow.do`, `flow.request`, `flow.start`, `flow.dialog` initiation) are retried a few times with a constant delay. You can customize this behavior globally for a workflow or per-step. **Retry Configuration (`RetryStrategy`):** ```typescript interface RetryStrategy { limit?: number | false; // Max attempts (false = infinite, 0 = no retries). Default: 5 delay?: Duration; // Delay between retries (ms or string like '1 minute'). Default: '1 minute' backoff?: 'constant' | 'linear' | 'exponential'; // Backoff strategy. Default: 'constant' } ``` **Applying Retries:** 1. **Workflow Defaults:** Set in the `defineWorkflow` configuration under the `defaults` key. See [Defining Workflows](../30-defining-workflows/#the-defineworkflow-function). ```typescript defineWorkflow( { // ... other config defaults: { retries: { limit: 3, delay: '30 seconds', backoff: 'exponential' } }, }, async (flow) => { /* ... */ }, ); ``` 2. **Per-Step Options:** Override defaults by passing `retries` in the `options` argument of an activity. ```typescript await flow.do( 'call-flaky-api', { retries: { limit: 10, delay: '5 seconds' } }, async ({ use }) => { /* ... */ }, ); // Disable retries for a specific step await flow.do( 'critical-once-only-task', { retries: false }, // Equivalent to { retries: { limit: 0 } } async ({ use }) => { /* ... */ }, ); ``` Choose retry strategies appropriate for the type of failure expected (e.g., exponential backoff for overloaded services). *** ## [Structured Error Handling](#structured-error-handling) [Section titled “Structured Error Handling”](#structured-error-handling) IdentityFlow provides structured error handling through custom error classes and utility functions exported from `@identity-flow/sdk`. This allows for more precise error management than relying solely on standard JavaScript `Error` objects. **Key Exports from `@identity-flow/sdk` (related to errors):** * **Error Classes**: * `WorkflowError`: Base class for workflow-related errors. * `NonRetryableError`: For errors that should not trigger the retry mechanism. * `ValidationError`: For issues related to data validation (often thrown automatically by schema checks, but can be used manually). * `TimeoutError`: A specific `NonRetryableError` for timeouts. * `AssertionError`: Thrown by `flow.assert()` or can be used for custom assertions. * `LockedError`: Indicates a resource is locked. 
* **Type Guard Functions**: * `isNonRetryableError(error: unknown): error is NonRetryableError` * `isValidationError(error: unknown): error is ValidationError` * `isLockedError(error: unknown): error is LockedError` * `isAbortError(error: unknown): error is AbortError` (for handling `AbortSignal` related errors) * **Other Utilities**: * `throwIfAborted(value: unknown)`: Checks and throws if an `AbortSignal` has been aborted. **Throwing Specific Errors:** When an activity function needs to signal a failure, throwing an instance of these specific error classes provides more context to the engine and to any catching logic. ```typescript import { NonRetryableError, ValidationError, defineWorkflow } from '@identity-flow/sdk'; // ... in an activity function ... if (criticalConditionFailed) { throw new NonRetryableError('Critical condition failed, will not retry.', { code: 'CRITICAL_FAILURE', }); } if (userInputInvalid) { throw new ValidationError('Invalid user input provided.', { issues: [{ message: 'Email is required', path: ['email'] }], }); } ``` **Catching and Checking Errors:** You can use standard `try...catch` blocks and the provided type guards to handle errors gracefully and specifically. ```typescript import { defineWorkflow, isNonRetryableError, isValidationError } from '@identity-flow/sdk'; // ... in execute function or an activity ... try { await flow.do('risky-operation', async () => { /* ... may throw custom errors ... */ }); } catch (error) { if (isValidationError(error)) { flow.warn('Validation failed during risky operation:', error.message, error.issues); // Handle validation error, maybe inform user or return specific result return { status: 'VALIDATION_ERROR', issues: error.issues }; } else if (isNonRetryableError(error)) { flow.error('Non-retryable error in risky operation:', error.message, { code: error.code }); // Perform cleanup for non-retryable failure await flow.do('cleanup-non-retryable', async () => { /* ... */ }); return { status: 'FATAL_ERROR', reason: error.message }; } else { // Generic error, might be retried by the engine or bubble up flow.error('Risky operation failed:', error); throw error; // Re-throw if you want the engine to handle retries or fail the workflow } } ``` * **Unrecoverable Errors:** If an error occurs outside of a configured retry mechanism (e.g., a `NonRetryableError` is thrown, or after all retries for a retryable error are exhausted), the workflow instance will typically transition to the `FAILED` state. Using these specific error types and utilities enhances the robustness and debuggability of your workflows. ### [Assertions with `flow.assert()`](#assertions-with-flowassert) [Section titled “Assertions with flow.assert()”](#assertions-with-flowassert) The `flow.assert(condition, messageOrError)` utility is a convenient way to perform simple precondition checks within your workflow logic. If the `condition` is false, it throws an `AssertionError` (which is a `NonRetryableError`) with the given message, or throws the provided error object. **Signature:** ```typescript flow.assert(condition: any, messageOrError?: string | ErrorDetails | Error): asserts condition; ``` **Example:** ```typescript async (flow) => { const { orderId, items } = flow.params; flow.assert(orderId, 'Order ID is required.'); flow.assert(items && items.length > 0, { message: 'Order must contain items.', code: 'EMPTY_ORDER', }); // ... rest of the workflow logic ... 
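  // One more hedged illustration (not part of the original example): the signature above
  // also accepts an Error instance, so a failed check can throw your own error object as-is.
  // The `quantity` field on items is assumed purely for illustration.
  flow.assert(
    items.every((item) => item.quantity > 0),
    new Error('Every order item must have a positive quantity.'),
  );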
}; ``` If an assertion fails, the workflow will typically terminate unless the `AssertionError` is caught and handled specifically. *** ## [Next Steps](#next-steps) [Section titled “Next Steps”](#next-steps) Complete your understanding of workflow development with these topics: [State, Lifecycle & Deployment ](../70-state-lifecycle-deployment/)Understand workflow states, their lifecycle, and deployment. [Workflow Rules ](../../deep-dive/05-workflow-rules/)Master essential guidelines for building reliable and maintainable workflows. # State, Lifecycle & Deployment > Understand workflow states, lifecycle, metadata, deployment, and best practices. ## [Understanding Workflow State & Lifecycle](#understanding-workflow-state--lifecycle) [Section titled “Understanding Workflow State & Lifecycle”](#understanding-workflow-state--lifecycle) Workflows and their individual steps transition through various states during execution. Understanding these helps in debugging and managing your automated processes. ### [Workflow Instance States](#workflow-instance-states) [Section titled “Workflow Instance States”](#workflow-instance-states) A workflow instance represents a single, end-to-end execution of a workflow definition. Its status reflects the overall progress of that execution. The primary instance states, as reflected in the GraphQL API (`WorkflowInstanceStatus`), are: * **`ACTIVE`**: The workflow instance has started and is currently processing, or is waiting for one of its steps to complete (e.g., a step is `SLEEPING`, `PENDING`, `REQUESTING`, or `RELYING`). It has not yet reached a final state. * **`PAUSED`**: The instance was explicitly paused via an API call (e.g., GraphQL `pause` mutation). It can be resumed later via the `resume` mutation, returning it to its previous operational state (which would typically be `ACTIVE` while its current step continues). * **`COMPLETED`**: The workflow’s `execute` function has successfully run to completion and returned a value. This is a final state. * **`FAILED`**: An unrecoverable error occurred during the workflow’s execution (e.g., an error thrown after exhausting all retries for a step, or an internal engine error). This is a final state. * **`TERMINATED`**: The instance was explicitly stopped via an API call (e.g., GraphQL `terminate` mutation). This is a final state. You can query the current `status` of any workflow instance using the GraphQL API. ### [Workflow Step States](#workflow-step-states) [Section titled “Workflow Step States”](#workflow-step-states) Each activity invoked within a workflow (like `flow.do`, `flow.sleep`, `flow.dialog`, etc.) becomes a “step” with its own lifecycle. The state of the *current* step often dictates why an `ACTIVE` instance is waiting. Key step states (`WorkflowStepStatus`) include: * **`ACTIVE`**: The step’s main logic (e.g., the task function in `flow.do`) is currently executing. * **`SLEEPING`**: The step was initiated by `flow.sleep()` and is paused until the specified time. The instance remains `ACTIVE`. * **`PENDING`**: The step was initiated by `flow.dialog()` and is waiting for an external continuation (e.g., user input via UI). The instance remains `ACTIVE`. * **`REQUESTING`**: The step was initiated by `flow.request()` and is waiting for an external callback or for its polling condition to be met. The instance remains `ACTIVE`. * **`RELYING`**: The step was initiated by `flow.start()` and is waiting for the initiated sub-workflow instance to complete. The instance remains `ACTIVE`. 
* **`CONTINUING`**: A transient state when a `PENDING` or `REQUESTING` step receives a continuation request and is processing the incoming data before resolving or potentially failing. * **`COMPLETED`**: The step’s logic has finished successfully. * **`ERRORED`**: The step’s logic encountered an error. If retries are configured, it might transition back to `ACTIVE` or a waiting state. If all retries are exhausted, the step remains `ERRORED`, which may then cause the parent workflow instance to transition to `FAILED`. Understanding both instance and step states is key. For example, a workflow *instance* can be `ACTIVE` because its current *step* is `SLEEPING`. ### [Lifecycle & Event Sourcing](#lifecycle--event-sourcing) [Section titled “Lifecycle & Event Sourcing”](#lifecycle--event-sourcing) Every transition between these states (for both instances and steps), every activity completion, every error, and every variable update (`flow.vars`) is recorded as an immutable event in the instance’s history. * When a workflow needs to resume (e.g., after a `flow.sleep` step completes or a `flow.dialog` step is externally continued), the engine replays the relevant events to reconstruct the exact state (including `flow.vars` and dependency instances from `flow.use`) where it left off. * This event sourcing mechanism is what provides the durability, auditability, and resilience of IdentityFlow workflows. * You can query the full event history for any instance via the GraphQL API (`instance.events`) for detailed debugging and auditing. While you don’t typically interact with the event log directly within your workflow code, understanding its existence and purpose helps grasp how the engine manages state and resumes execution reliably. *** ## [Workflow Metadata (`meta`)](#workflow-metadata-meta) [Section titled “Workflow Metadata (meta)”](#workflow-metadata-meta) You can attach arbitrary JSON-serializable metadata to workflow definitions and instances: * **Definition Metadata:** Set in `defineWorkflow` config. See [Defining Workflows](../30-defining-workflows/#the-defineworkflow-function). ```typescript defineWorkflow( { // ... other config meta: { ownerTeam: 'billing', criticality: 'high' }, } /* ... */, ); ``` * **Instance Metadata:** Passed via the `meta` property when starting an instance (e.g., via the GraphQL `start` mutation) or when starting a sub-workflow with `flow.start`. ```typescript // Inside flow.start activity function (ctx) => ({ workflow: 'sub-workflow-name', params: { /* ... */ }, meta: { correlationId: flow.instance.meta.correlationId || flow.instance.id, parentId: flow.instance.id, }, }); ``` * **Accessing Metadata:** Read via `flow.definition.meta` and `flow.instance.meta`. * **System Metadata:** IdentityFlow automatically includes system metadata like `traceId`, `spanId`, `correlationId`, `causationId` in `flow.instance.meta` for observability and tracing across instances and steps. While optional according to the type definitions, these fields are typically populated or propagated by the standard IdentityFlow server implementation based on engine configuration and incoming requests. Metadata is useful for categorization, routing, reporting, and tracing workflows. *** ## [Deployment & Execution (High-Level)](#deployment--execution-high-level) [Section titled “Deployment & Execution (High-Level)”](#deployment--execution-high-level) This guide focuses on *developing* workflows using the `@identity-flow/sdk`. 
Once you have defined your workflow in a TypeScript file, you need to make it available to the IdentityFlow engine and then manage its execution (instances). Details on workflow definition discovery by the engine (compilation, bundling, search paths) can be found in the [Defining Workflows](../30-defining-workflows/#workflow-definition-discovery-by-the-engine) guide. **Managing Instances via GraphQL API:** While the SDK defines *how* a workflow behaves, you typically *manage* its execution—starting new instances, querying status, and handling interactions—through the IdentityFlow **GraphQL API** (usually available at `http://localhost:4000/graphql` locally). Here are some key operations you’ll perform via the API: * **Starting a Workflow Instance:** Use the `workflowDefinition` mutation to find your registered definition and then call `start`. ```graphql mutation StartSimpleFlow($params: JSON!) { workflowDefinition(by: { name: "@carv/simple-flow", releaseChannel: "latest" }) { start(input: { params: $params, label: "My Simple Instance" }) { success instance { id status } error { message } } } } ``` * **Querying Instances:** Find instances by ID, status, definition, etc., using the `workflowInstance` or `workflowInstances` queries. ```graphql query GetInstanceStatus($instanceId: ID!) { workflowInstance(by: { id: $instanceId }) { id label status message createdAt updatedAt finishedAt vars # Access exposed variables steps { nodes { id name status kind } } events { nodes { id type message createdAt } } } } ``` * **Continuing Pending Steps (`dialog`/`request`):** When a workflow is `PENDING` or `REQUESTING` after `flow.dialog` or `flow.request`, find the step’s `token` (often included in `step.data` or `instance.vars` depending on your `define` function) and use the `workflowActivity` mutation. ```graphql mutation ContinueDialog($token: ID!, $responseData: JSON!) { workflowActivity(token: $token) { continue(input: { status: COMPLETED, data: $responseData }) { success instance { id status # Should now be ACTIVE or another state } error { message } } } } ``` * **Controlling Instances (`pause`, `resume`, `terminate`):** Use mutations on `workflowInstance` or `workflowInstances` to control the lifecycle. ```graphql mutation PauseInstance($instanceId: ID!) { workflowInstance(by: { id: $instanceId }) { pause(input: { reason: "Manual intervention needed" }) { success instance { status } } } } # Similar mutations exist for resume and terminate ``` **Summary:** * **SDK (`@identity-flow/sdk`)**: You write `.ts` files using `defineWorkflow` to define the logic. * **Engine**: Registers these definitions (details in [Defining Workflows](../30-defining-workflows/)). * **GraphQL API**: You interact with the engine to start, query, continue, and control instances of those definitions. Keep this separation in mind as you build and operate your workflows. *** ## [Troubleshooting & Best Practices](#troubleshooting--best-practices) [Section titled “Troubleshooting & Best Practices”](#troubleshooting--best-practices) Here are some tips for developing and debugging IdentityFlow workflows: > For a comprehensive set of guidelines, also refer to our [Rules of Workflows](../../deep-dive/05-workflow-rules/) document. **Common Pitfalls:** * **Non-Deterministic Code:** Avoid logic within your `execute` function that relies on non-deterministic sources (like `Math.random()`, `new Date()` without caching, or external API calls without `flow.do` or `flow.request`). The engine relies on deterministic replay for resilience. 
(See [Workflow Rule #5](../../deep-dive/05-workflow-rules/#5-use-deterministic-step-names)). * **Incorrect `flow.use()` Usage:** Using inline anonymous functions for bindings (`flow.use(() => new Client())`) will bypass caching and potentially create resource leaks. Always use stable function references. (See [Dependencies & Logging](../50-dependencies-logging/#managing-dependencies-flowuse)). * **Missing `await`:** Forgetting `await` on `flow.do`, `flow.sleep`, etc., will cause unexpected behavior as the workflow won’t pause correctly. * **Unclear Step Names:** Using generic or duplicate names for steps (`flow.do('step1', ...)` multiple times) breaks idempotency and makes debugging harder. (See [Workflow Rule #5](../../deep-dive/05-workflow-rules/#5-use-deterministic-step-names)). * **Large `flow.vars`:** Avoid storing excessive or complex data in `flow.vars`. Use it primarily for exposing essential status or progress information externally. (See [Workflow Rule #3](../../deep-dive/05-workflow-rules/#3-persist-state-through-steps)). **Debugging Tips:** * **Use Logging Extensively:** Add `flow.log`, `flow.debug`, `flow.info`, `flow.warn`, `flow.error` calls throughout your workflow logic. These logs are associated with the instance and step in the engine. (See [Dependencies & Logging](../50-dependencies-logging/#logging-methods)). * **Query Instance History:** Use the GraphQL API to inspect the `status`, `vars`, `steps`, and especially the `events` history of a workflow instance. The event log provides a detailed trace of execution. * **Check Engine Logs:** Consult the logs of the IdentityFlow engine itself for lower-level errors or processing details. * **Isolate Issues:** If a complex workflow fails, try commenting out sections or simplifying logic to pinpoint the problematic step or interaction. * **Test Individual Steps:** Write unit tests for complex logic within `flow.do` tasks where possible. **Best Practices Summary:** * **Keep Workflows Focused:** Aim for workflows that model a single, well-defined business process. * **Use Sub-Workflows (`flow.start`):** Break down complex processes into smaller, reusable sub-workflows. (See [Core Workflow Activities](../40-workflow-activities/#starting-sub-workflows-flowstart)). * **Validate Inputs/Outputs:** Use schema validation (`schema` option) for workflow `params` and critical activity results. (See [Defining Workflows](../30-defining-workflows/#input-validation-with-schemas)). * **Handle Errors Gracefully:** Use `try...catch` for expected errors and configure sensible `retries`. (See [Error Handling & Retries](../60-error-handling-retries/)). * **Use Descriptive Names:** Give clear, unique names to workflow definitions and steps. * **Manage Dependencies Correctly:** Use `flow.use` with stable binding functions for external services. * **Leverage `flow.vars` Appropriately:** Expose meaningful status or progress via `flow.vars`, but manage internal processing state with standard variables. By following these guidelines, you can build robust, maintainable, and debuggable workflows with IdentityFlow. *** ## [Next Steps](#next-steps) [Section titled “Next Steps”](#next-steps) Now that you have a comprehensive understanding of developing guides, explore advanced topics and practical examples: [Deep Dive Topics ](../../deep-dive/)Explore advanced concepts like Workflow Rules, Testing, and Observability. [Workflow Examples ](../../examples/)See practical implementations of common workflow patterns. 
# LLM Context Files > Information about the automatically generated context files for Large Language Models. This documentation site automatically generates context files optimized for use with Large Language Models (LLMs). These files provide a structured way for LLMs to understand the content of this site. You can learn more about the `llms.txt` initiative at [llmstxt.org](https://llmstxt.org/). The following context files are generated: ## [Documentation Sets](#documentation-sets) [Section titled “Documentation Sets”](#documentation-sets) * [Overview documentation (`llms.txt`)](https://flow.identity-hub.io/llms.txt): An overview of the documentation content, primarily pointing to the abridged and complete versions. * [Abridged documentation (`llms-small.txt`)](https://flow.identity-hub.io/llms-small.txt): A compact version of the documentation for IdentityFlow, with non-essential content removed. This is ideal for quick queries or when token limits are a concern. * [Complete documentation (`llms-full.txt`)](https://flow.identity-hub.io/llms-full.txt): The full documentation for IdentityFlow, including all pages and details. This is suitable for comprehensive understanding or in-depth research. ## [Notes](#notes) [Section titled “Notes”](#notes) * The content for these files is automatically generated from the same source as the official documentation, ensuring they are always up-to-date. # Modern Workflow Engine > A fully event-driven, code-first workflow engine designed for transparent, flexible, and auditable process automation. IdentityFlow is proudly engineered by **[Kenoxa GmbH](https://kenoxa.de)**, your partners in pioneering workflow automation. # Resilient automation that never misses a beat. IdentityFlow keeps every step of your process transparent and traceable. Define workflows in JavaScript, execute with confidence, and maintain full visibility throughout. ### Where is my workflow? Real-time status updates show you exactly which actions are in progress or stalled. ### What happened so far? A comprehensive event log traces every transition, offering full visibility into your process history. For Business Stakeholders Automate complex, multi-step business processes with complete visibility. Every action is recorded, providing real-time status updates and a comprehensive audit trail for compliance. * Real-time process monitoring * Comprehensive audit trails * Automated compliance reporting * Process optimization insights Learn more about business features in the [Business Features](/features). For Technical Teams Define workflows in plain JavaScript with intuitive helper methods. Every state transition is recorded as an immutable event, making debugging, re-execution, and auditing straightforward. * Code-first workflow definition * Event-sourced state management * OpenTelemetry integration * High concurrency support Explore technical documentation in the [Developer Guide](/guides). *** Event-Sourced Architecture Every action is recorded for complete replayability Our architecture captures every state transition as an event, enabling precise re-execution, detailed auditing, and a full historical record of every workflow’s execution. The engine’s event-sourced design ensures that each change—from creation to termination—is logged as an immutable event. Deterministic Workflow Execution Predictable and repeatable process flows The design of IdentityFlow guarantees that given the same input, workflows will always follow the same execution path. 
This determinism is a cornerstone for reliable replays and is vital for ensuring consistency in production environments. Performance Optimization Strategies High throughput with smart resource management The engine’s performance is enhanced by techniques such as output caching and lazy dependency injection, which reduce unnecessary processing and maximize resource efficiency. Audit Trail and Timestamping Immutable records with precise timestamps Each internal event is timestamped and logged with relevant metadata, creating a comprehensive audit trail that is critical for compliance and forensic analysis. Scalability and Parallelism Built to scale with concurrent workflows IdentityFlow is engineered to handle parallel execution and sub-workflow orchestration, ensuring that it can support high-volume, complex processes without sacrificing performance. Error Recovery and Resilience Automated error handling and fault tolerance The engine is designed to recover from failures gracefully, with built-in mechanisms for retrying failed actions, rolling back transactions, and ensuring that workflows can continue seamlessly. Scheduled Jobs Automate recurring tasks seamlessly An upcoming capability will allow workflows to run at predefined intervals, ensuring that scheduled jobs are executed exactly once per cycle. Time Travel Querying Query your event store as of any past point in time Review historical workflow states. This “time travel” feature simplifies auditing and forensic analysis. Transparent Audit Trails Immutable audit trail capturing every state change IdentityFlow makes it easy to verify process integrity and meet compliance standards. Seamless Integration Integrates easily with third-party APIs, databases, and legacy systems Our modular architecture and lazy-loaded dependency model streamline your automation efforts. *** # [Get Started with IdentityFlow](#get-started-with-identityflow) [Section titled “Get Started with IdentityFlow”](#get-started-with-identityflow) While IdentityFlow is not open source, we offer comprehensive demos and trial periods for qualified organizations. Contact us to learn how IdentityFlow can transform your workflow automation. [Request Demo Access](mailto:info@kenoxa.de) Our team will respond within 24 hours to schedule a personalized demo.