How do I pass a DTO through a Laravel pipeline?


To pass a Data Transfer Object (DTO) through a Laravel pipeline, you can follow these steps:

1. Define the DTO class:
- Create a DTO class to encapsulate the data the pipeline needs, for example `SiteCrawlerTransporter`:

```php
namespace App\Transporters;

use App\Models\Site;
use Symfony\Component\DomCrawler\Crawler;

class SiteCrawlerTransporter
{
    // Constructor property promotion: $site and $crawler become
    // public properties available to every pipeline stage.
    public function __construct(
        public Site $site,
        public Crawler $crawler,
    ) {
    }
}
```
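A DTO is just a typed container: one object bundles everything the stages need, so the pipeline only ever passes a single payload along. As a language-agnostic sketch (Python stands in for PHP here; plain strings stand in for the `Site` model and the `Crawler` instance):

```python
from dataclasses import dataclass

# Hypothetical Python analogue of the DTO: one object carries all the
# state the pipeline stages share, instead of multiple loose parameters.
@dataclass
class SiteCrawlerTransporter:
    site: str   # stands in for the Site model
    html: str   # stands in for the DomCrawler instance

payload = SiteCrawlerTransporter(site="example.com", html="<title>Hi</title>")
print(payload.site)  # example.com
```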

2. Send the DTO through the pipeline:
- In your job's `handle` method, build the DTO and hand it to `Pipeline::send()`; each stage receives the same object and passes it on via `$next`:

```php
namespace App\Jobs;

use App\Models\Site;
use App\Transporters\SiteCrawlerTransporter;
use Closure;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Http;
use Illuminate\Support\Facades\Pipeline;
use Symfony\Component\DomCrawler\Crawler;

class SiteCrawlerJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(public Site $site)
    {
    }

    public function handle(): void
    {
        // Fetch the page, wrap it in the DTO, and run it through the pipes.
        Pipeline::send(new SiteCrawlerTransporter(
                $this->site,
                new Crawler(Http::get($this->site->domain)->body()),
            ))
            ->through([
                function (SiteCrawlerTransporter $payload, Closure $next) {
                    // Process the payload, e.g. read $payload->crawler.
                    return $next($payload);
                },
            ])
            ->thenReturn();
    }
}
```
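Under the hood, `send()->through()->thenReturn()` composes the stages into nested calls, where each stage decides whether to invoke the next one. A minimal Python sketch of that composition (a toy model of the pattern, not Laravel's actual implementation; a dict stands in for the DTO):

```python
from functools import reduce

def send(payload):
    """Toy model of a pipeline: each pipe is called with the payload
    and a `next` callable, mirroring send()->through()->thenReturn()."""
    def through(pipes):
        def then_return():
            # Build the chain from the inside out: the innermost
            # "next" simply returns the payload unchanged.
            chain = reduce(
                lambda nxt, pipe: (lambda p: pipe(p, nxt)),
                reversed(list(pipes)),
                lambda p: p,
            )
            return chain(payload)
        return then_return
    return through

result = send({"count": 0})([
    lambda p, nxt: nxt({**p, "count": p["count"] + 1}),   # runs first
    lambda p, nxt: nxt({**p, "count": p["count"] * 10}),  # runs second
])()
print(result["count"])  # 10
```

Because the payload is one object, adding a new field to the DTO never changes any pipe's signature.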

3. Extract pipes into classes:
- For reusable stages, a pipe can be a class with a `handle` method instead of a closure; Laravel resolves it from the container. A hypothetical `ExtractPageTitle` pipe might look like this:

```php
namespace App\Pipes;

use App\Transporters\SiteCrawlerTransporter;
use Closure;

class ExtractPageTitle
{
    public function handle(SiteCrawlerTransporter $payload, Closure $next)
    {
        // Read from (or enrich) the DTO, e.g.:
        // $title = $payload->crawler->filter('title')->text();
        return $next($payload);
    }
}
```

Then list it by class name in the job: `->through([\App\Pipes\ExtractPageTitle::class])`.

By following these steps, you can effectively pass a DTO through a Laravel pipeline and manage complex data processing tasks efficiently.
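One useful property of this pattern is short-circuiting: a stage can stop the chain by returning a value without invoking `$next`, so the remaining stages never run. A minimal Python sketch of that behavior (a dict stands in for the DTO; `run` is a hypothetical toy driver, not Laravel's API):

```python
def validate(payload, nxt):
    # Short-circuit: return without calling `nxt` to abort the pipeline.
    if not payload.get("html"):
        return {"error": "empty response"}
    return nxt(payload)

def parse(payload, nxt):
    payload["title"] = "parsed"
    return nxt(payload)

def run(payload, pipes):
    # Tiny driver: walk the pipes recursively, handing each the rest.
    if not pipes:
        return payload
    head, *rest = pipes
    return head(payload, lambda p: run(p, rest))

print(run({"html": ""}, [validate, parse]))                   # {'error': 'empty response'}
print(run({"html": "<p>x</p>"}, [validate, parse])["title"])  # parsed
```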
