Security Tip: Restricting Local File Access

[Tip#33] We can easily restrict access to files on remote storage like S3, but what about local files?

Greetings my friends! Firstly, welcome to 2023! 🥳 This week’s tip comes from a suggestion over on our Substack Chat, where a reader asked how to restrict access to files in local storage, rather than a remote storage like S3. There are a few ways to solve this, and today we’re going to cover a simple method I’ve used many times.

Who is going to Laracon EU? I’ll be there, speaking at my first in-person Laracon[1]! I have a very ambitious plan for my talk, and you’re not going to want to miss it live. But it’s going to take some coordination, so wish me luck! 🤞

💡 Ensure your apps are secure, book in a Laravel Security Audit and Penetration Test! 🕵️

Looking to learn more?
Security Tip #15: Be Careful Of Transliteration
▶️ In Depth #5: Rehashing Passwords

Restricting Local File Access

It’s trivial to restrict access to files on an external storage system like S3: just set the right permissions on the bucket and use the `Storage::temporaryUrl()` method. However, this doesn’t work for local files. Laravel’s Public Disk makes local files easily accessible, but does not restrict that access in any way.
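For reference, the S3 version is a one-liner. A quick sketch, assuming a configured `s3` disk (the file path here is a placeholder):

```php
use Illuminate\Support\Facades\Storage;

// A temporary, signed URL to a file on the S3 disk, valid for 5 minutes.
// ('reports/q4.pdf' is a placeholder path — use your own.)
$url = Storage::disk('s3')->temporaryUrl(
    'reports/q4.pdf',
    now()->addMinutes(5)
);
```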

One option is to use randomly generated filenames, but that’s just security through obscurity, and has no real benefit if someone can index or list the files. It also doesn’t allow you to define an expiry time. So it doesn’t really do the job.

To solve this problem, we need to use a bit of creativity and a couple of Laravel’s helpers.

Firstly, store the files outside the `public/` directory, so they aren’t directly accessible. A subfolder inside `storage/` is usually a good spot.
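For example, you could define a dedicated disk for these files. A minimal sketch, assuming a disk named `uploads` rooted under `storage/` (the name and path are illustrative):

```php
// config/filesystems.php — a local 'uploads' disk that lives under
// storage/, well outside the web server's document root.
'disks' => [
    // ...
    'uploads' => [
        'driver' => 'local',
        'root' => storage_path('app/uploads'),
    ],
],
```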

Secondly, create a new route to serve the files from, adding the filename as the final route parameter, and add the `signed` middleware to this route. (I’m sure you can see where this is going!)

Route::get('uploads/{file}', function (string $file) {
    return Storage::disk('uploads')->download($file);
})->where(['file' => '\w[0-9a-zA-Z-_.]+'])
    ->middleware('signed')
    ->name('uploads.download');

Finally, use the `URL::signedRoute()` and `URL::temporarySignedRoute()` helpers to generate a secure URL to the file[2].
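For example, assuming the route above has been given the name `uploads.download` (an assumption — name your route to match):

```php
use Illuminate\Support\Facades\URL;

// A signed URL to the file that expires in 30 minutes.
$url = URL::temporarySignedRoute(
    'uploads.download',
    now()->addMinutes(30),
    ['file' => 'report.pdf']
);

// Or a signed URL with no expiry.
$url = URL::signedRoute('uploads.download', ['file' => 'report.pdf']);
```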

The `signed` middleware will ensure that only URLs generated by your app can access the files, matching the functionality of S3’s `Storage::temporaryUrl()` method.
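Under the hood, signed routes are just an HMAC over the URL, computed with the app’s secret key. A framework-free sketch of the idea (this is not Laravel’s actual implementation — parameter names and encoding details differ):

```php
<?php

// Sign a URL by appending an expiry timestamp and an HMAC over the
// whole URL. Anyone can see the signature, but only the holder of the
// secret key can generate a valid one.
function signUrl(string $url, int $expiresAt, string $key): string
{
    $url .= (str_contains($url, '?') ? '&' : '?') . 'expires=' . $expiresAt;
    $signature = hash_hmac('sha256', $url, $key);

    return $url . '&signature=' . $signature;
}

// Verify by splitting off the signature, recomputing the HMAC over the
// rest of the URL, and checking the expiry timestamp.
function verifyUrl(string $signedUrl, string $key): bool
{
    [$url, $signature] = explode('&signature=', $signedUrl, 2);
    parse_str(parse_url($url, PHP_URL_QUERY) ?? '', $params);

    $expected = hash_hmac('sha256', $url, $key);

    // hash_equals() is a constant-time comparison, avoiding timing attacks.
    return hash_equals($expected, $signature)
        && (int) ($params['expires'] ?? 0) > time();
}

$key = 'app-secret';
$signed = signUrl('https://example.test/uploads/report.pdf', time() + 300, $key);

var_dump(verifyUrl($signed, $key));              // bool(true)
var_dump(verifyUrl($signed . 'tampered', $key)); // bool(false)
```

Because the signature covers the full URL, changing the filename, the expiry, or any other parameter invalidates it.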

Have you solved this problem a different way? Leave a comment and share your solution!


  1. The example above will initiate a download of the file, so you’ll need to tweak it and possibly add some headers if you want to serve images or other non-direct-download files. You could base this on the requested extension, or use a different route. Whatever works for your application.
  2. There will be a performance hit on these requests, given they’ll need to boot up Laravel rather than serve the file directly. If you need high-performance file handling, then local file storage probably isn’t what you’re looking for.
  3. Any time you’re dealing with signed requests, don’t forget to be careful of caching. Over-eager caching could result in the files being cached with the signature ignored - allowing direct access that bypasses the signature.
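On those last two points, a variant of the route that serves the file inline (for images and the like) and tells caches not to store the response might look like this - a sketch, assuming the same `uploads` disk:

```php
use Illuminate\Support\Facades\Route;
use Illuminate\Support\Facades\Storage;

Route::get('uploads/{file}', function (string $file) {
    // Serve the file inline (the Content-Type is inferred from the file),
    // and tell browsers and shared caches not to store the signed response.
    return response()->file(
        Storage::disk('uploads')->path($file),
        ['Cache-Control' => 'no-store, private']
    );
})->where(['file' => '\w[0-9a-zA-Z-_.]+'])
    ->middleware('signed');
```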

  [1] I’ve presented at a number of Laracon Onlines, but this will be my first in-person Laracon. I’m incredibly excited to finally meet Laravel folks for the first time!

  [2] We dived into Signed Routes back in In Depth #9, so if you need a refresher, head over there.