Update: Freek Van der Herten has an updated version of this tutorial, which uses memory more efficiently and supports larger file uploads.
Uploading to Amazon S3 can be a great way to keep your files loading quickly, and it lets you spend more time on your code rather than on your filesystem.
Laravel 5's new Filesystem makes this easy, but the documentation is thin on how to actually accomplish it. The first thing we need to do is tell our controller class to use the Filesystem contract and the Request class.
use Illuminate\Http\Request;
use Illuminate\Contracts\Filesystem\Filesystem;
Next, let's set up a controller action to handle the uploaded file.
public function uploadFileToS3(Request $request)
{
$image = $request->file('image');
}
As you can see, the Request class has a specific file method for dealing with uploaded files. We assign the uploaded file to a variable, in this case $image.
Next we need to assign a file name to the uploaded file. You could leave this as the original filename, but in most cases you will want to change it to keep things consistent. Let’s change it to a timestamp, and append the file extension to it.
$imageFileName = time() . '.' . $image->getClientOriginalExtension();
Now we just need to create a new S3 Filesystem instance, define the path relative to our bucket, and upload the file. We will use the $s3->put() method and pass three parameters:
- Filepath relative to your bucket
- The contents of the file
- Permission of the file (optional)
$s3 = \Storage::disk('s3');
$filePath = '/support-tickets/' . $imageFileName;
$s3->put($filePath, file_get_contents($image), 'public');
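Putting the steps together, the complete controller action looks something like this (a sketch; the controller name is hypothetical, and '/support-tickets/' is just the example path used above):

```php
<?php

use Illuminate\Http\Request;

class UploadController extends Controller
{
    public function uploadFileToS3(Request $request)
    {
        // Grab the uploaded file from the request
        $image = $request->file('image');

        // Name the file with a timestamp, keeping the original extension
        $imageFileName = time() . '.' . $image->getClientOriginalExtension();

        // Upload to the 's3' disk, relative to the bucket, publicly readable
        $s3 = \Storage::disk('s3');
        $filePath = '/support-tickets/' . $imageFileName;
        $s3->put($filePath, file_get_contents($image), 'public');

        return $filePath;
    }
}
```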
That’s all there is to it. Remember to keep your API Key and Secret Key in your .env file. You don’t want those in your version control, so load them as an environment variable, like so:
's3' => [
'driver' => 's3',
'key' => env('S3_KEY'),
'secret' => env('S3_SECRET'),
'region' => env('S3_REGION'),
'bucket' => env('S3_BUCKET'),
],
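The corresponding entries in your .env file would look something like this (placeholder values; the variable names must match whatever you passed to env() above):

```ini
S3_KEY=AKIAXXXXXXXXXXXXXXXX
S3_SECRET=your-secret-access-key
S3_REGION=us-east-1
S3_BUCKET=your-bucket-name
```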
Very nice and clear article. I was looking for the solution everywhere but only found unclear explanations. After following your steps, I solved the problem. You saved the day.
Thanks a lot.
I didn’t know how to set files public. This was very helpful.
Thank you man!
This is a good way to do it for small files, but you should be aware that file_get_contents will load the entire file into memory. If you have large files (e.g. over 10 MB) you should use streams instead. Here's an example of how to do that: https://github.com/spatie/laravel-medialibrary/blob/master/src/Filesystem.php#L55
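As a rough sketch of the stream approach (whether put() accepts a stream resource directly depends on your Laravel/Flysystem version; on older versions you may need the underlying driver's putStream()):

```php
// Open a read stream instead of loading the whole file into memory
$stream = fopen($image->getRealPath(), 'r');

$s3 = \Storage::disk('s3');
$s3->put('/support-tickets/' . $imageFileName, $stream, 'public');

// Close the stream if put() has not already done so
if (is_resource($stream)) {
    fclose($stream);
}
```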
I am allowing my users to upload images of 10 MB maximum; do I need to use streams in this case?
I want to resize the images also and am trying it like this:
public function uploadFileToS3(Request $request) {
$image = Image::make($request->file('image'))->encode('jpg', 75);
$s3 = Storage::disk('s3');
$image_file_name = $this->generateName($request->name) . '.jpg';
$file_path = '/' . config('folder') . '/' . $request->name . '/';
$s3->put($file_path.'original_'.$image_file_name, $image, 'public');
$s3->put($file_path.'medium_'.$image_file_name, $image->fit(300, 300), 'public');
$s3->put($file_path.'thumb_'.$image_file_name, $image->fit(100, 100), 'public');
return json_encode(array(
'filename' => $image_file_name
));
}
The only problem is that all images saved to S3 are of the same size. What am I doing wrong here?
@alvin
Why don't you resize your image with the Intervention package? Something like:
$image = Image::make($request->file('image'));
$image->resize(300, 300)->save($filePathMedium);
$image->resize(100, 100)->save($filePathThumb);
I am getting a fatal error exception:
Class 'League\Flysystem\AwsS3v2\AwsS3Adapter' not found
Did you make sure you installed the package using composer first? You need to install the adapter on top of Laravel’s core to leverage S3.
I am quite new to AWS and Laravel 5.2. I have built a class (not a controller) with S3 functions and I am trying to upload images via the command line. How do I actually access this function if it's not in a controller? And how do I pass the Request to the function, and what needs to be inside the Request?
Good article, thanks for your effort. I'm studying how to switch to Amazon S3.
But I think that this way the user must wait twice as long. Is that correct?
You have to install the laravel-flysystem and aws-sdk-php-laravel packages. To upload files to S3 with Laravel 5 you can follow these steps:
composer require aws/aws-sdk-php-laravel:~2.0
composer require graham-campbell/flysystem:~2.1
composer require league/flysystem-aws-s3-v2:~1.0
Everything is working fine for me, but I am not sure how to get the publicly accessible URL of the file that was uploaded.
I tried a few approaches found by searching the internet, but could not get the URL.
$url = Storage::url('fileName.ext');
In the ..blade.php
{!! Form::file('image') !!}
In the Controller
public function store(Request $request)
{
……….
$imageFileName = "123.png";
$s3 = \Storage::disk('s3');
$filePath = '/mybucket/' . $imageFileName;
$s3->put($filePath, file_get_contents($request->file('image')), 'public');
…………
}
Why do I get this error?
file_get_contents(): Filename cannot be empty
Same issue here. Please help me figure out how to fix this.
Are you sure $request->file('image') is not empty/null?
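One way to guard against that error is to verify the file actually arrived before reading it (a sketch; 'image' is the assumed form field name). Note that the form must also be opened with file support, e.g. Form::open(['files' => true]), or the upload never reaches PHP at all:

```php
public function store(Request $request)
{
    // Bail out early if no valid file came through under the 'image' field
    if (!$request->hasFile('image') || !$request->file('image')->isValid()) {
        return back()->withErrors('No valid file was uploaded in the "image" field.');
    }

    $image = $request->file('image');
    $filePath = '/mybucket/' . time() . '.' . $image->getClientOriginalExtension();

    \Storage::disk('s3')->put($filePath, file_get_contents($image), 'public');

    return $filePath;
}
```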
Great article sir….
Hi, I followed your tutorial and it works great, thanks.
But I want to make my file private on S3 and only access it using signed URLs. I have tried 'private' instead of 'public', but I can still access the files via the plain Amazon S3 URL:
$s3->put($filePath, file_get_contents($image), 'private');
I am using Laravel 5.0.
Cool stuff here! Thanks for the post.
Nice article. It helps developers in their day-to-day work.
Hi, I want to get the list of uploaded files with secure URLs. When we upload to AWS, files can be private or public, but how do I make sure a URL cannot be accessed by any third party?
Hi friends, here is complete code to upload a private file and retrieve a generated URL that expires after a set number of minutes or seconds.
To upload a file to AWS S3 from Laravel 5.3 or 5.4:
Run: composer require league/flysystem-aws-s3-v3
Set this in config/filesystems.php:
's3' => [
'driver' => 's3',
'key' => 'your_generated_key',
'secret' => 'your_generated_secret',
'region' => 'us-east-1',
'bucket' => 'your_bucket_name', // the bucket to upload into
],
run: php artisan config:cache
Make a controller. In the upload action, first validate the request:
$this->validate($request, [
'image' => 'required|image|mimes:jpeg,png,jpg,gif,svg,mp4|max:2048',
]);
$imageName = 'SampleVideo1.' . $request->image->getClientOriginalExtension();
$image = $request->file('image');
//$t = \Storage::disk('s3')->put($imageName, file_get_contents($image), 'public');
$t = \Storage::disk('s3')->put($imageName, file_get_contents($image), 'private');
$imageName = \Storage::disk('s3')->url($imageName);
return back()
->with('success', 'Image uploaded successfully.')
->with('path', $imageName);
}
public function imageUpload()
{
// this code generates a new signed URL for your file
$value = "SampleVideo1.mp4";
$disk = \Storage::disk('s3');
if ($disk->exists($value))
{
$command = $disk->getDriver()->getAdapter()->getClient()->getCommand('GetObject', [
'Bucket' => \Config::get('filesystems.disks.s3.bucket'),
'Key' => $value,
//'ResponseContentDisposition' => 'attachment;' // for download
]);
$request = $disk->getDriver()->getAdapter()->getClient()->createPresignedRequest($command, '+10 minutes');
//$request = $disk->getDriver()->getAdapter()->getClient()->createPresignedRequest($command, '+15 seconds');
$generate_url = $request->getUri();
echo $generate_url;
}
// this code shows the form
return view('admin.image-upload');
}
}
// To retrieve the generated URL of a private file or video, use the same presigned-request code shown above; the file cannot be opened directly because of its private permission, but the generated URL works until it expires.
Great work…
Thanks a lot. It saved my day 🙂
How do I retrieve the file information in the bucket?
Great tutorial, thanks. Can you give me a solution for uploading a whole directory recursively to S3?
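One way to do that is to walk the local directory with SPL iterators and put each file under its relative path (a sketch; the local and remote paths here are just examples):

```php
$localDir = storage_path('app/exports');
$s3 = \Storage::disk('s3');

$iterator = new \RecursiveIteratorIterator(
    new \RecursiveDirectoryIterator($localDir, \FilesystemIterator::SKIP_DOTS)
);

foreach ($iterator as $file) {
    // Build the S3 key from the path relative to the local directory
    $relativePath = ltrim(str_replace($localDir, '', $file->getPathname()), DIRECTORY_SEPARATOR);
    $s3->put('/exports/' . $relativePath, file_get_contents($file->getPathname()), 'public');
}
```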
If I want to make the files privately accessible, what would be the best way to do it?
Thanks a lot! Saved my time. Most helpful tutorial for s3.
I have created my first bucket, but I can’t find the credentials 🙁
Hello, this is my code example.
————————————————————————-
$user = \Auth::user()->student;
$file = array('file' => Input::file('zip'));
$destinationPath = storage_path('task/'.$user->id);
$extension = Input::file('zip')->getClientOriginalExtension();
$fileName = time().'.'.$extension;
print_r($file['file']);
$s3 = \Storage::disk('s3');
$filePath = '/task/' . $user->id . '/'. $fileName;
$s3->put($filePath, file_get_contents(Input::file('zip')), 'public');
…..
———————————————————–
And I have an error like this.
———————————————————–
"file_get_contents(): Filename cannot be empty"
———————————————————–
Please help me.
Hi,
Thanks for the tutorial, it's very useful. By the way, when I upload a file to S3, is it synchronous or asynchronous?
Instead of this long and tiring process, why not use a managed hosting platform, like Cloudways, to upload Laravel files to S3? The platform performs automatic backups to S3, without even needing to connect S3 to your EC2 instance.
What is the secret key that I need to put in .env, and where do I get it?
Thanks
pravin
Hi Guys,
Nice article and in comments as well.
I have a few questions about S3 buckets:
1. I want to check whether a folder exists before uploading an image there.
2. I want to check whether a file exists before uploading the image there.
3. If the image already exists, I want to rename it and then upload it to S3.
Can anyone help me achieve this?
Thanks in advance
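For the second and third points, the Storage facade's exists() method covers the check; keep in mind that S3 has no real folders (a "folder" is just a key prefix), so checking the full file key is usually enough. A sketch with example paths:

```php
$s3 = \Storage::disk('s3');
$filePath = '/uploads/' . $imageFileName;

// If an object with that key already exists, pick a new name first
// so the existing object is not overwritten
if ($s3->exists($filePath)) {
    $filePath = '/uploads/' . time() . '_' . $imageFileName;
}

$s3->put($filePath, file_get_contents($image), 'public');
```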
Great Tutorial. Straight to the point. Clear and Concise. AND IT WORKS! Thanks.
Hi, I get an error while uploading an image to the S3 bucket:
Missing required client configuration options: region: (string) A "region" configuration value is required for the "s3" service (e.g., "us-west-2"). A list of available public regions and endpoints can be found at http://docs.aws.amazon.com/general/latest/gr/rande.html
Error retrieving credentials from the instance profile metadata server. (Error creating resource: [message] fopen(http://169.254.169.254/latest/meta-data/iam/security-credentials/): failed to open stream: Connection timed out
I get this error when applying your code. Any solution?