Security Tip: Restricting Local File Access
[Tip#33] We can easily restrict access to files on remote storage like S3, but what about local files?
Greetings my friends! Firstly, welcome to 2023! 🥳 This week’s tip comes from a suggestion over on our Substack Chat, where a reader asked how to restrict access to files in local storage, rather than remote storage like S3. There are a few ways to solve this, and today we’re going to cover a simple method I’ve used many times.
Who is going to Laracon EU? I’ll be there, speaking at my first in-person Laracon! I have a very ambitious plan for my talk, and you’re not going to want to miss it live. But it’s going to take some co-ordination, so wish me luck! 🤞
💡 Ensure your apps are secure, book in a Laravel Security Audit and Penetration Test! 🕵️
Looking to learn more?
⏩ Security Tip #15: Be Careful Of Transliteration
▶️ In Depth #5: Rehashing Passwords
Restricting Local File Access
It’s trivial to restrict access to files on an external file storage system like S3: just set the right permissions on the bucket and use the `Storage::temporaryUrl()` method.
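For reference, that looks something like this (a quick sketch - `$path` is assumed to hold the file’s path on the disk, and the five-minute expiry is an arbitrary choice):

use Illuminate\Support\Facades\Storage;

// Generate a pre-signed S3 URL that expires after 5 minutes.
$url = Storage::disk('s3')->temporaryUrl($path, now()->addMinutes(5));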
However, this doesn’t work for local files. Laravel provides a Public Disk, which provides easy access - but does not restrict it in any way.
One option is to use randomly generated filenames, but that’s just security through obscurity, and has no real benefit if someone can index or list the files. It also doesn’t allow you to define an expiry time. So it doesn’t really do the job.
To solve this problem, we need to use a bit of creativity and a couple of Laravel’s helpers.
Firstly, store the files outside the `public/` folder, so they aren’t directly accessible. A subfolder inside `storage/` is usually a good spot.
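For example, you could define a dedicated local disk in `config/filesystems.php` (a minimal sketch - the `uploads` disk name is an assumption, chosen to match the route below):

'disks' => [

    // ... your existing disks ...

    // Files under storage/app/uploads are not directly web-accessible.
    'uploads' => [
        'driver' => 'local',
        'root' => storage_path('app/uploads'),
    ],

],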
Secondly, create a new route to serve the files from, adding the filename as the final route parameter, and add the `signed` middleware to this route. (I’m sure you can see where this is going!)
use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

Route::get('uploads/{file}', function (string $file) {
    // Stream the requested file from the private 'uploads' disk.
    return Storage::disk('uploads')->download($file);
})
    ->where(['file' => '\w[0-9a-zA-Z-_.]+'])
    ->name('uploads.download')
    ->middleware('signed');
Finally, use the `URL::signedRoute()` and `URL::temporarySignedRoute()` helpers to generate a secure route to the file.
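For example (a sketch - `$filename` is assumed to hold the stored file’s name, and the 30-minute expiry is an arbitrary choice):

use Illuminate\Support\Facades\URL;

// Permanent signed URL - valid until the app key is rotated.
$url = URL::signedRoute('uploads.download', ['file' => $filename]);

// Temporary signed URL - expires after 30 minutes.
$url = URL::temporarySignedRoute(
    'uploads.download',
    now()->addMinutes(30),
    ['file' => $filename]
);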
The `signed` middleware will ensure that only URLs generated by your app can access the files, matching the functionality of S3’s `Storage::temporaryUrl()` method.
Have you solved this problem a different way? Leave a comment and share your solution!
Notes:
The example above will initiate a download of the file, so you’ll need to tweak it and possibly add some headers if you want to serve images or other files that aren’t direct downloads. You could base this on the requested extension, or use a different route - whatever works for your application.
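For instance, Laravel’s `Storage::response()` helper serves the file inline rather than forcing a download (a minimal sketch - adjust the headers to suit your application):

// Serve the file inline (e.g. an image) instead of as a download.
return Storage::disk('uploads')->response($file);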
There will be a performance hit on these requests, given they’ll need to boot up Laravel rather than serve the file directly. If you need high-performance file handling, then local file storage probably isn’t what you’re looking for.
Any time you’re dealing with signed requests, don’t forget to be careful of caching. Over-eager caching could result in the files being cached with the signature ignored - allowing direct access that bypasses the signature.
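One simple mitigation is sending cache-busting headers with the response (a sketch - the exact headers you need will depend on your caching layers):

// Ask browsers and proxies not to cache the signed response.
return Storage::disk('uploads')->download($file, null, [
    'Cache-Control' => 'no-store, private',
]);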
I’ve presented at a number of Laracon Onlines, but this will be my first in-person Laracon. I’m incredibly excited to finally meet Laravel folks for the first time!
We dived into Signed Routes back in In Depth #9, so if you need a refresher, head over there.
How likely is the risk of someone listing or indexing the files? Since a signed URL can’t be revoked (without rotating the app key), isn’t using that vs. a long randomly generated filename (which can be changed when needed) quite equivalent from a security perspective? As you mention, the performance aspect can be important, and loading files via Laravel (rather than something like nginx) could even add a DoS vector.
Signed URLs with an expiration time definitely add to security, but they could be used even if the files were placed in the public folder. Of course, I also assume the aspect of executing uploaded files as code (e.g. uploading PHP files) is taken care of.
And just to be clear, I’d recommend not using the local driver at all, and instead using a third-party file storage solution, as mentioned in the article.
Thanks for this post Stephen, really helpful!
I did something similar. However, I needed to display the user uploaded images to the user who uploaded them only. So here is how I approached it:
1) Created a storage disk outside of the public folder, and standard CRUD controller methods for the user to upload one image to that disk.
2) Created a route that hits a controller action to show the image.
3) Used this route in the `src` attribute of the image tag.
4) Protected the route with the `auth` middleware.
5) In the controller method, placed a guard clause that checks the authenticated username against the username passed in as a route parameter.
6) If the check fails, it aborts with a 404.
7) If it passes, it returns the image as a response.
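Roughly, the controller action looks like this (a simplified sketch - the disk name and filename scheme here are made up for illustration; the full version is in the gist below):

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Storage;

// In the controller:
public function show(Request $request, string $username)
{
    // Guard clause: 404 unless the authenticated user owns the image.
    abort_if($request->user()->username !== $username, 404);

    // Return the image file as an inline response.
    return Storage::disk('user_images')->response("{$username}.jpg");
}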
Here is a gist:
https://gist.github.com/colbyalbo/27153a3ce0f8b1582fae8d2ca1811e8a
This definitely has a performance hit, but this page will be accessed infrequently, and it only allows the user to have one secure image at a time. I did experiment with allowing the user to upload 10 images, and when I viewed the screen that loads all of the images, the performance hit was definitely noticeable. So my solution wouldn’t be good for multiple images.
Happy New Year & thanks!