Upload Files to AWS S3 using Laravel

Update: Freek Van der Herten has an updated version of this tutorial, which is more memory-efficient and supports larger file uploads.

Uploading to Amazon S3 can be a great way to keep your files loading quickly (especially when fronted by a CDN such as CloudFront), and lets you focus more on your code and less on your filesystem.

Laravel 5's new Filesystem abstraction makes this easy, but the documentation is thin on how to actually accomplish it. The first thing we need to do is import the Filesystem contract and the Request class into our controller.


use Illuminate\Http\Request;
use Illuminate\Contracts\Filesystem\Filesystem;

Next, let's set up a controller action to handle the uploaded file.


public function uploadFileToS3(Request $request)
{
  $image = $request->file('image');
}

As you can see, the Request class has a specific file method for dealing with uploaded files. We assign the uploaded file to a variable, in this case $image.

Next we need to assign a file name to the uploaded file. You could leave this as the original filename, but in most cases you will want to change it to keep things consistent. Let’s change it to a timestamp, and append the file extension to it.


$imageFileName = time() . '.' . $image->getClientOriginalExtension();
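A bare timestamp can collide if two uploads land in the same second. A slightly safer variant (purely a suggestion, not part of the original tutorial) mixes in a random component:

```php
// Hypothetical alternative: timestamp plus a unique suffix to avoid name collisions
$imageFileName = time() . '-' . uniqid() . '.' . $image->getClientOriginalExtension();
```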

Now we just need to grab the S3 Filesystem disk, define the path relative to our bucket, and upload the file. We will use the $s3->put() method and pass three parameters.

  1. The file path, relative to your bucket
  2. The contents of the file
  3. The visibility of the file (optional)

$s3 = \Storage::disk('s3');
$filePath = '/support-tickets/' . $imageFileName;
$s3->put($filePath, file_get_contents($image), 'public');
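Putting the pieces together, the whole controller action might look something like the sketch below; the class name and the JSON response are illustrative assumptions, not part of the original.

```php
use Illuminate\Http\Request;

class TicketController extends Controller // class name is an assumption
{
    public function uploadFileToS3(Request $request)
    {
        // Grab the uploaded file from the request
        $image = $request->file('image');

        // Name the file after the current timestamp, keeping its original extension
        $imageFileName = time() . '.' . $image->getClientOriginalExtension();

        // Write the file to the S3 disk, publicly readable
        $s3 = \Storage::disk('s3');
        $filePath = '/support-tickets/' . $imageFileName;
        $s3->put($filePath, file_get_contents($image), 'public');

        // Illustrative response; return whatever your app needs
        return response()->json(['path' => $filePath]);
    }
}
```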

That's all there is to it. Remember to keep your AWS key and secret key in your .env file; you don't want those in your version control. Load them as environment variables in config/filesystems.php, like so:


's3' => [
    'driver' => 's3',
    'key'    => env('S3_KEY'),
    'secret' => env('S3_SECRET'),
    'region' => env('S3_REGION'),
    'bucket' => env('S3_BUCKET'),
],
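The matching .env entries would then look something like this (all values are placeholders):

```ini
S3_KEY=your-aws-access-key-id
S3_SECRET=your-aws-secret-access-key
S3_REGION=us-east-1
S3_BUCKET=your-bucket-name
```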

Comments

  1. Very nice and clear article. I was looking for the solution everywhere but only found unclear explanations. After following your steps, I solved the problem. You saved the day.

    Thanks a lot.

  2. I want to resize the images also and am trying it like this:
    public function uploadFileToS3(Request $request) {
    $image = Image::make($request->file('image'))->encode('jpg', 75);
    $s3 = Storage::disk('s3');

    $image_file_name = $this->generateName($request->name) . '.jpg';
    $file_path = '/' . config('folder') . '/' . $request->name . '/';

    $s3->put($file_path.'original_'.$image_file_name, $image, 'public');
    $s3->put($file_path.'medium_'.$image_file_name, $image->fit(300, 300), 'public');
    $s3->put($file_path.'thumb_'.$image_file_name, $image->fit(100, 100), 'public');

    return json_encode(array(
    'filename' => $image_file_name
    ));
    }

    The only problem is that all images saved to S3 are of the same size. What am I doing wrong here?

    1. @alvin
      why don't you resize your image using the Intervention package?
      $image = Image::make($request->file('image'))->resize(300, 300)->save($filePathMedium)->resize(100, 100)->save($filePathThumb);
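A likely cause of the identical sizes above: Intervention Image caches the encoded bytes when encode() is called, so put() keeps receiving the same buffer even after fit() mutates the image. A sketch (assuming the Intervention Image package; variable names are illustrative) that re-encodes each variant before uploading:

```php
// Each variant is encoded *after* resizing, so put() receives different bytes
$s3 = \Storage::disk('s3');
$source = $request->file('image');

$s3->put($file_path . 'original_' . $image_file_name,
    (string) Image::make($source)->encode('jpg', 75), 'public');

$s3->put($file_path . 'medium_' . $image_file_name,
    (string) Image::make($source)->fit(300, 300)->encode('jpg', 75), 'public');

$s3->put($file_path . 'thumb_' . $image_file_name,
    (string) Image::make($source)->fit(100, 100)->encode('jpg', 75), 'public');
```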

  3. I am quite new to this AWS usage and Laravel 5.2. I have built a class (not a controller) with S3 functions and I am trying to upload images via the command line. So how do I actually access this function if it's not a controller? And how do I give the Request to the function, and what needs to be inside the Request?

  4. Good article, thanks for your effort. I'm studying how to switch to Amazon S3.

    But I think that this way the user must wait twice as long (once for the upload to the server, once for the transfer to S3). Is that correct?

  5. You have to install the laravel-flysystem and aws-sdk-php-laravel packages. To upload files to S3 with Laravel 5 you can follow these steps:

    composer require aws/aws-sdk-php-laravel:~2.0
    composer require graham-campbell/flysystem:~2.1
    composer require league/flysystem-aws-s3-v2:~1.0

  6. Everything is working fine for me, but I am not sure how to get the publicly accessible URL of the file that was uploaded.

    I tried a few approaches found searching the internet, but could not get the URL.
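On Laravel 5.2 and later the Storage facade can build the public URL for you; otherwise it can be assembled by hand from the bucket and region (placeholder values below):

```php
// Preferred: let the S3 adapter build the URL
$url = \Storage::disk('s3')->url($filePath);

// Manual fallback with placeholder bucket/region
$url = 'https://your-bucket-name.s3.us-east-1.amazonaws.com' . $filePath;
```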

  7. In the ..blade.php
    {!! Form::file('image') !!}

    In the Controller

    public function store(Request $request)
    {
    ..........
    $imageFileName = "123.png";
    $s3 = \Storage::disk('s3');
    $filePath = '/mybucket/' . $imageFileName;
    $s3->put($filePath, file_get_contents($request->file('image')), 'public');
    ............
    }
    Why do I get an error on

    file_get_contents(): Filename cannot be empty
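That error usually means the file never reached PHP at all, most often because the form is not multipart; with the Form builder shown, that means passing 'files' => true to Form::open(). A defensive check in the controller (a sketch, assuming the same 'image' field) makes the failure obvious:

```php
// Guard against a missing upload before trying to read it
if (! $request->hasFile('image')) {
    return back()->withErrors('No file was uploaded; is the form multipart?');
}

$s3->put($filePath, file_get_contents($request->file('image')), 'public');
```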

  8. Hi, I followed your tutorial and it works great, thanks.
    But I want to make my file private on S3 and only access it using signed URLs. I have tried 'private' instead
    of 'public', but I can still access the files via the plain Amazon S3 URL:
    $s3->put($filePath, file_get_contents($image), 'private');
    I am using Laravel 5.0.

  9. Hi, I want to list uploaded files with secure URLs. When we upload to AWS, files can be stored both private and public, but how do I ensure these URLs cannot be accessed by third parties?
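On newer Laravel versions (roughly 5.4.22 and later) the S3 disk can issue expiring signed URLs directly, which addresses the private-file questions above; the path and lifetime here are illustrative:

```php
// Works for objects stored with 'private' visibility; URL expires after 10 minutes
$signedUrl = \Storage::disk('s3')->temporaryUrl($filePath, \Carbon\Carbon::now()->addMinutes(10));
```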

  10. Hi friends, here is complete code for uploading a private file and retrieving a generated URL that expires after a set number of minutes or seconds.

    Upload a file to AWS S3 from Laravel 5.3 or 5.4.

    Run: composer require league/flysystem-aws-s3-v3
    Set this in config/filesystems.php:

    's3' => [
    'driver' => 's3',
    'key' => 'your_generated_key',
    'secret' => 'your_generated_secret',
    'region' => 'us-east-1',
    'bucket' => 'your_bucket_folder', //folder will have public permission
    ],

    Run: php artisan config:cache

    Make a controller:

    public function imageUploadPost(Request $request)
    {
    $this->validate($request, [
    'image' => 'required|image|mimes:jpeg,png,jpg,gif,svg,mp4|max:2048',
    ]);

    $imageName = 'SampleVideo1.'.$request->image->getClientOriginalExtension();
    $image = $request->file('image');
    //$t = \Storage::disk('s3')->put($imageName, file_get_contents($image), 'public');
    $t = \Storage::disk('s3')->put($imageName, file_get_contents($image), 'private');
    $imageName = \Storage::disk('s3')->url($imageName);

    return back()
    ->with('success','Image Uploaded successfully.')
    ->with('path',$imageName);
    }
    public function imageUpload()
    {

    //this code generates a new signed URL for your file
    $value = "SampleVideo1.mp4";
    $disk = \Storage::disk('s3');
    if ($disk->exists($value))
    {
    $command = $disk->getDriver()->getAdapter()->getClient()->getCommand('GetObject', [
    'Bucket' => \Config::get('filesystems.disks.s3.bucket'),
    'Key' => $value,
    //'ResponseContentDisposition' => 'attachment;' //for download
    ]);

    $request = $disk->getDriver()->getAdapter()->getClient()->createPresignedRequest($command, '+10 minutes');
    //$request = $disk->getDriver()->getAdapter()->getClient()->createPresignedRequest($command, '+15 seconds');

    $generate_url = $request->getUri();
    echo $generate_url;
    }

    //this code shows the form
    return view('admin.image-upload');

    }

    }

    //To retrieve the generated URL of a private file or video (e.g. a video stored under this name that cannot be opened directly because of its private permission), use the same signed-URL code shown in imageUpload() above; now you can access your generated URL.

  11. Great tutorial, thanks! Can you give me a solution for uploading a whole directory recursively to S3?
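One way to approach this (a sketch, with illustrative paths) is to walk the local directory recursively and put() each file under its relative path:

```php
// Mirror a local directory into the bucket, preserving structure
$localDir = storage_path('app/uploads'); // illustrative source directory
$s3 = \Storage::disk('s3');

$iterator = new \RecursiveIteratorIterator(
    new \RecursiveDirectoryIterator($localDir, \FilesystemIterator::SKIP_DOTS)
);

foreach ($iterator as $file) {
    // Key each object by its path relative to $localDir
    $relative = ltrim(str_replace($localDir, '', $file->getPathname()), '/');
    $s3->put('/uploads/' . $relative, file_get_contents($file->getPathname()), 'public');
}
```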

  12. Hello, this is my code example.
    -------------------------------------------------------------------------
    $user = \Auth::user()->student;

    $file = array('file' => Input::file('zip'));
    $destinationPath = storage_path('task/'.$user->id);
    $extension = Input::file('zip')->getClientOriginalExtension();
    $fileName = time().'.'.$extension;

    print_r ($file['file']);
    $s3 = \Storage::disk('s3');
    $filePath = '/task/' . $user->id . '/'. $fileName;
    $s3->put($filePath, file_get_contents(Input::file('zip')), 'public');
    .....
    -----------------------------------------------------------
    And I have an error like this:
    -----------------------------------------------------------
    "file_get_contents(): Filename cannot be empty"
    -----------------------------------------------------------
    Please help me.

  13. Instead of this long and tiring process, why not use a managed hosting platform, like Cloudways, to upload Laravel files to S3? That platform performs automatic backups to S3, without even needing to connect S3 to your EC2 instance.

  14. Hi Guys,
    Nice article, and good information in the comments as well.
    I have a few questions about S3 buckets:
    1. I want to check whether the folder exists before uploading an image there
    2. I want to check whether the file exists before uploading the image there
    3. If the image exists, then update the image name and then upload it to S3

    Can anyone help how to achieve it?

    Thanks in advance
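A sketch addressing the three checks above (with illustrative variable names): S3 has no real folders, keys are just strings, so testing the full object key covers both the folder and file questions, and a fresh name can be derived on a collision:

```php
$s3 = \Storage::disk('s3');

// 1 & 2: a single exists() check on the full key covers both folder and file
if ($s3->exists($filePath)) {
    // 3: the key is taken, so derive a new name before uploading
    $imageFileName = time() . '-' . uniqid() . '.' . $image->getClientOriginalExtension();
    $filePath = '/support-tickets/' . $imageFileName;
}

$s3->put($filePath, file_get_contents($image), 'public');
```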
