Building Scalable Background Jobs with trigger.dev: A Complete Guide

Jonathan Wilke

7/12/2025

#cron #trigger.dev #nextjs #tutorial #saas

As your SaaS application grows, you'll inevitably encounter scenarios where certain tasks need to run in the background. Whether it's sending welcome emails to new users, processing image uploads, generating reports, or handling webhook notifications, these operations shouldn't block your main application flow.

In this comprehensive guide, we'll explore how to implement robust background job processing using trigger.dev, a powerful platform that makes it easy to build, deploy, and manage background jobs and scheduled tasks.

Note: For detailed setup instructions, check out our comprehensive documentation.

Why Background Jobs Matter

Before diving into the implementation, let's understand why background jobs are crucial for modern applications:

Performance Benefits

  • Non-blocking operations: Users don't wait for time-consuming tasks
  • Better user experience: Immediate response times for critical actions
  • Resource optimization: Efficient use of server resources

Scalability Advantages

  • Horizontal scaling: Distribute workload across multiple workers
  • Fault tolerance: Retry mechanisms for failed jobs
  • Queue management: Handle traffic spikes gracefully

Common Use Cases

  • Email notifications and newsletters
  • File processing and image optimization
  • Data analytics and reporting
  • Third-party API integrations
  • Database maintenance tasks
  • Webhook processing

What trigger.dev Offers

trigger.dev is a developer-friendly platform that provides a robust infrastructure for background job processing. It offers:

  • Type-safe job definitions with TypeScript
  • Built-in retry mechanisms and error handling (a sketch follows this list)
  • Cron scheduling for recurring tasks
  • Real-time monitoring and logging
  • Seamless integration with popular frameworks
  • Webhook triggers for event-driven jobs
  • Job chaining for complex workflows
  • Rate limiting and throttling capabilities
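
To make the retry point concrete, here is a minimal sketch of a task that opts into an explicit retry policy. The option names follow the trigger.dev v3 task API; the webhook-forwarding body is purely illustrative:

import { task } from "@trigger.dev/sdk/v3";
 
// Minimal sketch: a task with an explicit retry policy.
export const forwardWebhookTask = task({
  id: "forward-webhook",
  retry: {
    maxAttempts: 3,
    minTimeoutInMs: 1000,
    maxTimeoutInMs: 30000,
    factor: 2,
  },
  run: async (payload: { url: string; body: unknown }) => {
    // Any error thrown here is retried according to the policy above.
    const response = await fetch(payload.url, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(payload.body),
    });
    
    if (!response.ok) {
      throw new Error(`Webhook delivery failed with status ${response.status}`);
    }
    
    return { delivered: true };
  },
});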

Basic Background Job Examples

Let's explore some practical examples of background jobs you can implement with trigger.dev.

Welcome Email Job

import { schemaTask } from "@trigger.dev/sdk/v3";
import { z } from "zod";
import { sendEmail } from "@repo/mail";
 
const welcomeEmailSchema = z.object({
  userId: z.string(),
  email: z.string().email(),
  name: z.string(),
});
 
export const welcomeEmailTask = schemaTask({
  id: "welcome-email",
  schema: welcomeEmailSchema,
  run: async (payload) => {
    const { userId, email, name } = payload;
    
    try {
      await sendEmail({
        to: email,
        subject: "Welcome to Our Platform!",
        template: "WelcomeEmail",
        data: {
          name,
          userId,
        },
      });
      
      return {
        success: true,
        message: `Welcome email sent to ${email}`,
      };
    } catch (error) {
      throw new Error(`Failed to send welcome email: ${error.message}`);
    }
  },
});
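
Defining the task is only half the picture; your application still needs to enqueue it. A minimal sketch of triggering the task above from a Next.js route handler might look like this (the route path and import path are illustrative):

// app/api/signup/route.ts (illustrative)
import { welcomeEmailTask } from "@/trigger/welcome-email";
 
export async function POST(request: Request) {
  const { userId, email, name } = await request.json();
  
  // ...create the user in your database here...
  
  // Enqueue the background job; the request returns immediately.
  const handle = await welcomeEmailTask.trigger({ userId, email, name });
  
  return Response.json({ queued: true, runId: handle.id });
}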

Image Processing Job

import { schemaTask } from "@trigger.dev/sdk/v3";
import { z } from "zod";
import sharp from "sharp";
 
const imageProcessingSchema = z.object({
  imageUrl: z.string().url(),
  userId: z.string(),
  sizes: z.array(z.object({
    width: z.number(),
    height: z.number(),
    suffix: z.string(),
  })),
});
 
export const imageProcessingTask = schemaTask({
  id: "image-processing",
  schema: imageProcessingSchema,
  run: async (payload) => {
    const { imageUrl, userId, sizes } = payload;
    
    // Download the original image
    const response = await fetch(imageUrl);
    if (!response.ok) {
      throw new Error(`Failed to download image: ${response.status}`);
    }
    const imageBuffer = await response.arrayBuffer();
    
    // Process each size
    const processedImages = await Promise.all(
      sizes.map(async ({ width, height, suffix }) => {
        const processed = await sharp(Buffer.from(imageBuffer))
          .resize(width, height, { fit: 'cover' })
          .webp({ quality: 80 })
          .toBuffer();
        
        // Upload to your storage (e.g., S3, Cloudinary)
        const uploadUrl = await uploadToStorage(processed, `${userId}-${suffix}.webp`);
        
        return { size: `${width}x${height}`, url: uploadUrl };
      })
    );
    
    return {
      success: true,
      processedImages,
      userId,
    };
  },
});
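
The uploadToStorage helper above is left to you. As one possibility, a sketch backed by S3 via @aws-sdk/client-s3 could look like the following; the bucket name, region, and public URL format are assumptions to adapt to your setup:

import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
 
const s3 = new S3Client({ region: process.env.AWS_REGION });
 
// Sketch of the storage helper referenced above.
async function uploadToStorage(buffer: Buffer, key: string): Promise<string> {
  const bucket = process.env.S3_BUCKET as string;
  
  await s3.send(
    new PutObjectCommand({
      Bucket: bucket,
      Key: key,
      Body: buffer,
      ContentType: "image/webp",
    }),
  );
  
  // Assumes the bucket serves objects publicly; adjust for your CDN or signed URLs.
  return `https://${bucket}.s3.amazonaws.com/${key}`;
}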

Data Export Job

import { schemaTask } from "@trigger.dev/sdk/v3";
import { z } from "zod";
import { db } from "@repo/database";
import { sendEmail } from "@repo/mail";
 
const dataExportSchema = z.object({
  userId: z.string(),
  exportType: z.enum(["users", "orders", "analytics"]),
  dateRange: z.object({
    start: z.string(),
    end: z.string(),
  }),
  email: z.string().email(),
});
 
export const dataExportTask = schemaTask({
  id: "data-export",
  schema: dataExportSchema,
  run: async (payload) => {
    const { userId, exportType, dateRange, email } = payload;
    
    let data;
    switch (exportType) {
      case "users":
        data = await db.user.findMany({
          where: {
            createdAt: {
              gte: new Date(dateRange.start),
              lte: new Date(dateRange.end),
            },
          },
        });
        break;
      case "orders":
        data = await db.order.findMany({
          where: {
            createdAt: {
              gte: new Date(dateRange.start),
              lte: new Date(dateRange.end),
            },
          },
          include: { user: true, items: true },
        });
        break;
      case "analytics":
        data = await generateAnalyticsReport(dateRange);
        break;
    }
    
    // Generate CSV file
    const csvContent = generateCSV(data);
    const fileUrl = await uploadFile(csvContent, `${exportType}-${Date.now()}.csv`);
    
    // Send email with download link
    await sendEmail({
      to: email,
      subject: `Your ${exportType} export is ready`,
      template: "DataExport",
      data: {
        exportType,
        fileUrl,
        dateRange,
      },
    });
    
    return {
      success: true,
      fileUrl,
      recordCount: data.length,
    };
  },
});
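
generateCSV, uploadFile, and generateAnalyticsReport are placeholders in the task above. As an illustration, a naive generateCSV for flat records could look like this; it assumes each record is a plain object and does not handle nested fields:

// Naive CSV serialization for flat records; quoting handles commas and double quotes.
function generateCSV(records: Record<string, unknown>[]): string {
  if (records.length === 0) {
    return "";
  }
  
  const headers = Object.keys(records[0]);
  const escape = (value: unknown) =>
    `"${String(value ?? "").replace(/"/g, '""')}"`;
  
  const rows = records.map((record) =>
    headers.map((header) => escape(record[header])).join(",")
  );
  
  return [headers.join(","), ...rows].join("\n");
}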

Cron Jobs and Scheduled Tasks

trigger.dev makes it incredibly easy to schedule recurring tasks using cron expressions. Here are some practical examples:

Daily Database Cleanup

import { schedules } from "@trigger.dev/sdk/v3";
import { db } from "@repo/database";
 
export const dailyCleanupTask = schedules.task({
  id: "daily-cleanup",
  cron: "0 2 * * *", // Run at 2 AM daily
  run: async () => {
    console.log("Starting daily cleanup...");
    
    // Clean up old sessions
    const deletedSessions = await db.session.deleteMany({
      where: {
        expiresAt: {
          lt: new Date(),
        },
      },
    });
    
    // Clean up expired verification tokens
    const deletedTokens = await db.verificationToken.deleteMany({
      where: {
        expires: {
          lt: new Date(),
        },
      },
    });
    
    // Clean up old log files
    const deletedLogs = await db.log.deleteMany({
      where: {
        createdAt: {
          lt: new Date(Date.now() - 30 * 24 * 60 * 60 * 1000), // 30 days ago
        },
      },
    });
    
    return {
      deletedSessions: deletedSessions.count,
      deletedTokens: deletedTokens.count,
      deletedLogs: deletedLogs.count,
      timestamp: new Date().toISOString(),
    };
  },
});

Weekly Analytics Report

import { schedules } from "@trigger.dev/sdk/v3";
import { db } from "@repo/database";
import { sendEmail } from "@repo/mail";
 
export const weeklyAnalyticsTask = schedules.task({
  id: "weekly-analytics",
  cron: "0 9 * * 1", // Run at 9 AM every Monday
  run: async () => {
    const startOfWeek = new Date();
    startOfWeek.setDate(startOfWeek.getDate() - 7);
    
    // Generate analytics data
    const newUsers = await db.user.count({
      where: {
        createdAt: {
          gte: startOfWeek,
        },
      },
    });
    
    const activeUsers = await db.user.count({
      where: {
        lastLoginAt: {
          gte: startOfWeek,
        },
      },
    });
    
    const totalRevenue = await db.order.aggregate({
      where: {
        createdAt: {
          gte: startOfWeek,
        },
        status: "completed",
      },
      _sum: {
        amount: true,
      },
    });
    
    const topProducts = await db.orderItem.groupBy({
      by: ["productId"],
      where: {
        order: {
          createdAt: {
            gte: startOfWeek,
          },
        },
      },
      _sum: {
        quantity: true,
      },
      orderBy: {
        _sum: {
          quantity: "desc",
        },
      },
      take: 5,
    });
    
    // Send report to admin
    await sendEmail({
      to: "admin@yourcompany.com",
      subject: "Weekly Analytics Report",
      template: "AnalyticsReport",
      data: {
        newUsers,
        activeUsers,
        totalRevenue: totalRevenue._sum.amount || 0,
        topProducts,
        period: "Last 7 days",
      },
    });
    
    return {
      newUsers,
      activeUsers,
      totalRevenue: totalRevenue._sum.amount || 0,
      reportSent: true,
    };
  },
});

Monthly Billing Reconciliation

import { schedules } from "@trigger.dev/sdk/v3";
import { db } from "@repo/database";
 
export const monthlyBillingTask = schedules.task({
  id: "monthly-billing",
  cron: "0 1 1 * *", // Run at 1 AM on the 1st of every month
  run: async () => {
    const lastMonth = new Date();
    lastMonth.setMonth(lastMonth.getMonth() - 1);
    
    // Get all active subscriptions
    const subscriptions = await db.subscription.findMany({
      where: {
        status: "active",
        nextBillingDate: {
          gte: lastMonth,
          lt: new Date(),
        },
      },
      include: {
        user: true,
        plan: true,
      },
    });
    
    let successfulCharges = 0;
    let failedCharges = 0;
    let totalRevenue = 0;
    
    for (const subscription of subscriptions) {
      try {
        // Process payment
        const payment = await processPayment({
          customerId: subscription.user.stripeCustomerId,
          amount: subscription.plan.price,
          description: `Monthly subscription - ${subscription.plan.name}`,
        });
        
        if (payment.success) {
          successfulCharges++;
          totalRevenue += subscription.plan.price;
          
          // Update subscription
          await db.subscription.update({
            where: { id: subscription.id },
            data: {
              lastBillingDate: new Date(),
              nextBillingDate: new Date(Date.now() + 30 * 24 * 60 * 60 * 1000),
            },
          });
        } else {
          failedCharges++;
          
          // Handle failed payment
          await handleFailedPayment(subscription);
        }
      } catch (error) {
        failedCharges++;
        console.error(`Failed to process payment for subscription ${subscription.id}:`, error);
      }
    }
    
    return {
      successfulCharges,
      failedCharges,
      totalRevenue,
      processedSubscriptions: subscriptions.length,
    };
  },
});
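
processPayment and handleFailedPayment are placeholders here. One reasonable shape for the failed-payment handler is to mark the subscription as past due and notify the customer; the status value, schema fields, and email template below are assumptions:

import { db } from "@repo/database";
import { sendEmail } from "@repo/mail";
 
// Sketch of the failed-payment handler referenced above.
async function handleFailedPayment(subscription: {
  id: string;
  user: { email: string };
  plan: { name: string };
}) {
  // Flag the subscription so the app can restrict access until payment succeeds.
  await db.subscription.update({
    where: { id: subscription.id },
    data: { status: "past_due" },
  });
  
  // Let the customer know their payment needs attention.
  await sendEmail({
    to: subscription.user.email,
    subject: "We couldn't process your payment",
    template: "PaymentFailed",
    data: { planName: subscription.plan.name },
  });
}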

Hourly Health Check

import { schedules } from "@trigger.dev/sdk/v3";
import { db } from "@repo/database";
 
export const healthCheckTask = schedules.task({
  id: "health-check",
  cron: "0 * * * *", // Run every hour
  run: async () => {
    const checks = [];
    
    // Check database connectivity
    try {
      await db.$queryRaw`SELECT 1`;
      checks.push({ name: "database", status: "healthy" });
    } catch (error) {
      checks.push({ name: "database", status: "unhealthy", error: error.message });
    }
    
    // Check external APIs
    try {
      const response = await fetch("https://api.stripe.com/v1/account", {
        headers: { Authorization: `Bearer ${process.env.STRIPE_SECRET_KEY}` },
      });
      checks.push({ name: "stripe", status: response.ok ? "healthy" : "unhealthy" });
    } catch (error) {
      checks.push({ name: "stripe", status: "unhealthy", error: error.message });
    }
    
    // Check email service
    try {
      // Test email service connectivity
      await testEmailService();
      checks.push({ name: "email", status: "healthy" });
    } catch (error) {
      checks.push({ name: "email", status: "unhealthy", error: error.message });
    }
    
    // Send alert if any service is unhealthy
    const unhealthyChecks = checks.filter(check => check.status === "unhealthy");
    if (unhealthyChecks.length > 0) {
      await sendAlert({
        type: "health_check_failed",
        services: unhealthyChecks,
        timestamp: new Date().toISOString(),
      });
    }
    
    return {
      timestamp: new Date().toISOString(),
      checks,
      healthy: unhealthyChecks.length === 0,
    };
  },
});
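
sendAlert and testEmailService are left undefined above. A simple sendAlert could forward the failing checks to an incident channel; the sketch below assumes a Slack-compatible incoming webhook URL in SLACK_WEBHOOK_URL:

// Sketch: post health-check failures to a Slack-style incoming webhook.
async function sendAlert(alert: {
  type: string;
  services: { name: string; status: string; error?: string }[];
  timestamp: string;
}) {
  const lines = alert.services.map(
    (service) => `- ${service.name}: ${service.status} ${service.error ?? ""}`
  );
  
  await fetch(process.env.SLACK_WEBHOOK_URL as string, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      text: `Health check failed (${alert.timestamp})\n${lines.join("\n")}`,
    }),
  });
}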

Advanced Job Patterns

Job Chaining

import { task } from "@trigger.dev/sdk/v3";
import { processPaymentTask } from "./process-payment";
import { sendConfirmationTask } from "./send-confirmation";
import { updateInventoryTask } from "./update-inventory";
 
export const orderProcessingTask = task({
  id: "order-processing",
  run: async (payload) => {
    // Step 1: Process payment and wait for its result
    const paymentResult = await processPaymentTask.triggerAndWait({
      orderId: payload.orderId,
      amount: payload.amount,
    });
    
    if (!paymentResult.ok) {
      throw new Error("Payment processing failed");
    }
    
    // Step 2: Update inventory
    await updateInventoryTask.trigger({
      orderId: payload.orderId,
      items: payload.items,
    });
    
    // Step 3: Send confirmation
    await sendConfirmationTask.trigger({
      orderId: payload.orderId,
      customerEmail: payload.customerEmail,
    });
    
    return { success: true, orderId: payload.orderId };
  },
});

Conditional Job Execution

import { task } from "@trigger.dev/sdk/v3";
import { db } from "@repo/database";
 
export const smartNotificationTask = task({
  id: "smart-notification",
  run: async (payload) => {
    const { userId, eventType } = payload;
    
    // Get user preferences
    const user = await db.user.findUnique({
      where: { id: userId },
      include: { preferences: true },
    });
    
    // Check if the user exists and wants this type of notification
    if (!user || !user.preferences?.[eventType]) {
      return { skipped: true, reason: "User preference disabled" };
    }
    
    // Send notification based on user's preferred channel
    switch (user.preferences.channel) {
      case "email":
        await sendEmailNotification(user, eventType);
        break;
      case "sms":
        await sendSMSNotification(user, eventType);
        break;
      case "push":
        await sendPushNotification(user, eventType);
        break;
    }
    
    return { success: true, channel: user.preferences.channel };
  },
});

Batch Processing

import { task } from "@trigger.dev/sdk/v3";
import { db } from "@repo/database";
import { sendEmail } from "@repo/mail";
 
export const batchEmailTask = task({
  id: "batch-email",
  run: async (payload) => {
    const { campaignId } = payload;
    
    // Process emails in batches of 100
    const batchSize = 100;
    let processed = 0;
    
    while (true) {
      const batch = await db.subscriber.findMany({
        where: { campaignId, emailSent: false },
        take: batchSize,
      });
      
      if (batch.length === 0) break;
      
      // Send emails in parallel
      await Promise.all(
        batch.map(subscriber =>
          sendEmail({
            to: subscriber.email,
            subject: "Newsletter",
            template: "Newsletter",
          })
        )
      );
      
      // Mark as sent
      await db.subscriber.updateMany({
        where: { id: { in: batch.map(s => s.id) } },
        data: { emailSent: true },
      });
      
      processed += batch.length;
    }
    
    return { processed };
  },
});
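
For very large campaigns you may not want a single long-running task looping over every subscriber. trigger.dev can also fan work out to child runs so each email is retried independently; a sketch using batchTrigger with a hypothetical sendNewsletterEmailTask looks like this:

import { task } from "@trigger.dev/sdk/v3";
import { db } from "@repo/database";
// Hypothetical child task that sends a single newsletter email.
import { sendNewsletterEmailTask } from "./send-newsletter-email";
 
export const fanOutNewsletterTask = task({
  id: "fan-out-newsletter",
  run: async (payload: { campaignId: string }) => {
    const subscribers = await db.subscriber.findMany({
      where: { campaignId: payload.campaignId, emailSent: false },
    });
    
    // Enqueue one child run per subscriber; failures retry in isolation.
    await sendNewsletterEmailTask.batchTrigger(
      subscribers.map((subscriber) => ({
        payload: { subscriberId: subscriber.id, email: subscriber.email },
      }))
    );
    
    return { enqueued: subscribers.length };
  },
});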

Best Practices

1. Task Design Principles

  • Keep tasks focused: Each task should do one thing well
  • Make tasks idempotent: Tasks should be safe to run multiple times (see the sketch after this list)
  • Handle errors gracefully: Always include proper error handling
  • Use meaningful task IDs: Make debugging easier with descriptive names
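
To make the idempotency point concrete, here is a minimal sketch of a welcome-email task guarded by a flag in the database, so re-running it does not send duplicate emails. The welcomeEmailSentAt column is an assumption about your schema:

import { schemaTask } from "@trigger.dev/sdk/v3";
import { z } from "zod";
import { db } from "@repo/database";
import { sendEmail } from "@repo/mail";
 
export const idempotentWelcomeEmailTask = schemaTask({
  id: "idempotent-welcome-email",
  schema: z.object({ userId: z.string() }),
  run: async (payload) => {
    const user = await db.user.findUnique({ where: { id: payload.userId } });
    
    // Safe to run multiple times: skip if the email was already sent.
    if (!user || user.welcomeEmailSentAt) {
      return { skipped: true };
    }
    
    await sendEmail({
      to: user.email,
      subject: "Welcome to Our Platform!",
      template: "WelcomeEmail",
      data: { name: user.name },
    });
    
    await db.user.update({
      where: { id: user.id },
      data: { welcomeEmailSentAt: new Date() },
    });
    
    return { skipped: false };
  },
});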

2. Cron Job Best Practices

  • Choose appropriate timing: Avoid peak hours for resource-intensive jobs
  • Handle timezone correctly: Be explicit about timezone handling
  • Monitor execution times: Ensure jobs complete within expected timeframes
  • Implement proper logging: Track job execution and results
  • Plan for failures: Have fallback mechanisms for critical jobs

3. Security Considerations

  • Validate inputs: Always validate task payloads
  • Use environment variables: Never hardcode sensitive data
  • Implement rate limiting: Prevent abuse of your task system (see the concurrency sketch after this list)
  • Monitor access: Keep track of who can trigger tasks
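
For the rate-limiting point, trigger.dev lets you cap how many runs of a task execute at once via its queue settings; a minimal sketch (the limit value is arbitrary) might look like this:

import { task } from "@trigger.dev/sdk/v3";
 
// Sketch: at most 5 runs of this task execute at the same time.
export const throttledSyncTask = task({
  id: "throttled-sync",
  queue: {
    concurrencyLimit: 5,
  },
  run: async (payload: { accountId: string }) => {
    // Call the third-party API here without overwhelming its rate limits.
    return { accountId: payload.accountId };
  },
});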

4. Monitoring and Observability

  • Log everything: Include relevant context in logs
  • Set up alerts: Get notified of task failures
  • Track metrics: Monitor task performance and success rates
  • Use structured logging: Make logs searchable and analyzable (see the logging sketch below)
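
For structured logging, the SDK ships a logger you can use inside run functions so that log lines show up with their metadata in the trigger.dev dashboard; a small sketch:

import { logger, task } from "@trigger.dev/sdk/v3";
 
export const loggedTask = task({
  id: "logged-task",
  run: async (payload: { orderId: string }) => {
    // Structured fields stay searchable instead of being flattened into a string.
    logger.info("Processing order", { orderId: payload.orderId });
    
    try {
      // ...do the work here...
      logger.info("Order processed", { orderId: payload.orderId });
    } catch (error) {
      logger.error("Order processing failed", {
        orderId: payload.orderId,
        error: error instanceof Error ? error.message : String(error),
      });
      throw error;
    }
    
    return { orderId: payload.orderId };
  },
});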

Conclusion

Background jobs are essential for building scalable, user-friendly applications. With trigger.dev, you get a powerful, developer-friendly platform that handles the complexity of job queuing, scheduling, and execution while providing excellent tooling for development and monitoring.

The patterns and examples in this guide should give you a solid foundation for implementing background jobs in your Next.js applications. Remember to:

  • Start simple and iterate
  • Test thoroughly in development
  • Monitor your jobs in production
  • Follow the best practices outlined above

For detailed instructions on how to set up trigger.dev in your supastarter application, check out our comprehensive documentation.

Happy coding!
