Limit downloads per IP, possible for user?

Richard G
Verified User · Joined Jul 6, 2008 · Messages: 13,766 · Location: Maastricht
We have some customers who have some large images or .pdf documents on their websites.

Is it possible via .htaccess, or some other easy way a customer can set up himself, to make these downloadable only once or twice per IP address per day, or per week, something like that?
 

Besides that, only a not-so-easy way with .htaccess comes to my mind: setting a cookie with an expiry (max age) in .htaccess and using mod_rewrite to send the visitor either the file or an error message, but don't ask me how... https://www.askapache.com/htaccess/htaccess-fresh/#Cookie_Manipulation_Tests_mod_rewrite
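Purely as an illustration of that cookie idea, and not the exact .htaccess recipe from that link (which I can't vouch for), here is a rough sketch as a small Python WSGI script: if the visitor already has the cookie the download is refused, otherwise the cookie is set with a max age and the file is served. The file name big.pdf, the cookie name dl_done and the one-day lifetime are just assumptions for the example.

```python
#!/usr/bin/env python3
"""Sketch of the cookie-with-max-age idea as a tiny WSGI app.

This is only an illustration; the AskApache link describes doing the same
thing purely in .htaccess with mod_rewrite. File name, cookie name and
lifetime below are made-up assumptions.
"""
from wsgiref.simple_server import make_server

COOKIE = "dl_done"            # hypothetical cookie name
MAX_AGE = 24 * 60 * 60        # cookie lifetime: one day, in seconds
PDF_PATH = "big.pdf"          # hypothetical protected file

def app(environ, start_response):
    cookies = environ.get("HTTP_COOKIE", "")
    if f"{COOKIE}=1" in cookies:
        # Cookie already present: the visitor downloaded recently, show an error.
        start_response("403 Forbidden", [("Content-Type", "text/plain")])
        return [b"Download limit reached, please try again tomorrow.\n"]

    # No cookie yet: serve the file and set the cookie with a max age.
    with open(PDF_PATH, "rb") as fh:
        body = fh.read()
    start_response("200 OK", [
        ("Content-Type", "application/pdf"),
        ("Set-Cookie", f"{COOKIE}=1; Max-Age={MAX_AGE}; Path=/"),
    ])
    return [body]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```

Of course a visitor only has to clear the cookie to get around it, so it is not a real protection.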
 
Instead of a direct link to the download, it could be a script that tracks IPs and checks the list to see whether each one is new or not. I know you are not a coder, but that's the way I would do it if there is no other easy way.

The script would check a file for the IP address and the epoch time it was recorded.
If the IP didn't exist, it would record the IP and the current epoch time and allow the download.
If it did exist, it would check the stored epoch time against the current epoch time to see if it has expired.
If it has expired, allow the download and replace the old epoch time with the new one.
If it has not expired, redirect to an error page.

This way would not prevent somebody who is determined to download it. It just prevents casual downloading. To be more secure you would have to rename the file after each download.

Again, this is just the way I would do it. I am sure there are other ways, and some may be easier. But since you had not gotten many answers, I thought I would just throw it out there; a rough sketch of what I mean follows below.
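Here is a minimal sketch of that check, assuming (these details are not in the post itself) that the records are kept in a small JSON file next to the script and that the limit is one download per IP per day:

```python
#!/usr/bin/env python3
"""Minimal sketch of the IP/epoch-time check described above.

Assumptions not taken from the post: records live in a JSON file
(downloads.json) mapping IP -> epoch time, and the expiry is one day.
A real CGI/WSGI handler would read the client IP from the request and
either send the file or redirect to an error page based on the result.
"""
import json
import os
import time

TRACK_FILE = "downloads.json"   # hypothetical tracking file
EXPIRY_SECONDS = 24 * 60 * 60   # one day

def allowed(ip, now=None):
    """Return True if this IP may download again, recording the attempt."""
    now = time.time() if now is None else now

    # Load the existing IP -> epoch-time records (empty if the file is missing).
    records = {}
    if os.path.exists(TRACK_FILE):
        with open(TRACK_FILE) as fh:
            records = json.load(fh)

    last = records.get(ip)
    if last is not None and now - last < EXPIRY_SECONDS:
        # Seen recently and the entry has not expired: refuse the download.
        return False

    # New IP, or the old entry has expired: store the new time and allow it.
    records[ip] = now
    with open(TRACK_FILE, "w") as fh:
        json.dump(records, fh)
    return True

if __name__ == "__main__":
    print(allowed("203.0.113.7"))   # True on the first request
    print(allowed("203.0.113.7"))   # False within the expiry window
```

As said above, this only discourages casual downloading; switching to another IP address gets around it.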
 
Thank you both.
So in fact there is no easy solution that the user can set up himself.
The first one works with a database, but it's an HTML site; the second link only gives me a blank screen.

I'm indeed no coder. But then maybe it's a good idea to suggest that the customer change his site to WordPress.
It's now an HTML site which a customer built, and when he died last year it was taken over by a foundation that is managing it now.

And in WordPress I've seen some plugins that can do that, and preventing direct file access is easy in WP if I'm not mistaken.

At first I thought there might be some easy solution, like blocking direct access so they have to click it on the website itself.
That would at least cost the bad guy one more click, but I think they are using a scripted solution now, so for sure they can easily adjust to that.

So WP might be the best solution then.
 
The site seems to have some problems; yesterday it was fine. When I open it in the browser, I also get a blank page now. When I go to another site and then back in the browser, it works. In the Wayback Machine it's here: http://web.archive.org/web/20240120032843/https://www.askapache.com/htaccess/htaccess-fresh/
But besides that, I have no idea how to work with cookies and don't even know if it would work at all this way.
All other solutions that come to my mind are download scripts or systems, or file-sharing scripts, which log access by IP and time themselves.
 