
Prevent access to files using .htaccess


Question

Hi everyone,

I'm trying to use an .htaccess file to prevent type-in access to the .js (JavaScript) files located in a /scripts folder

with:

<Files ~ "\.js$">
Order Deny,Allow
Deny from all
</Files>

I can block type-in access, but then the pages can't use the scripts either. I added "Allow from 127.0.0.1", but it made no difference.

I also tried:

Options +FollowSymlinks
RewriteEngine On

RewriteCond %{HTTP_REFERER} !^http://mydomain.* [NC]
RewriteRule .*\.(js)$ http://mydomain.com [R,L]

In this case the scripts work, but the first time I type the location of a .js file, it opens. Only when I refresh the page do I get redirected. I really can't understand why this happens...

I find this really useful for blocking hotlinks, but it doesn't always block type-in requests.

By the way, I have no access to the Apache configuration.

Any help would be appreciated.

Thanks

7 answers to this question

Recommended Posts


The referrer is the page/site that you came from to load the current page. When you type the URL in directly, I'd expect the referrer to be blank; upon a refresh, however, it seems to be set to the "previous" page displayed!

What you're trying to do, AFAIK, isn't possible. However you go about trying to block access to the .js files, they need to be sent to the browser somehow so that it can use them; there's just no getting around it. Whatever you do, the code has to be available to the browser for it to run, and in being so it automatically becomes accessible to the user too!

Think about it:

1) You could stick the .js files in a directory outside of the web root, so they are inaccessible publicly, and then use a rewrite rule or a PHP file to get to them, but that makes absolutely no difference...

2) You could encrypt them, and have a piece of unencrypted js that decrypts them and allows them to run; the average person couldn't see the code, but anyone with js skills could easily obtain a decrypted copy...

They have to be public one way or another; all you can do is make things a little more difficult. I'd just give up if I were you, it's not worth the effort ;)


I use WHMCS, which uses .TPL files. The unfortunate part was that a user could access the filename.tpl file by typing it in directly (assuming they know the exact filename, which, if you use the script, you do). Because of this, when I was selling premium templates, some users were stealing them by simply going to all 40+ TPL files manually and viewing the hardcoded source.

To prevent direct access to the TPLs BUT still allow them to be read by the server, I used the following:

<Files "*.tpl">
# Order Allow,Deny with no Allow directive present: every HTTP request is denied.
Order Allow,Deny
Deny from All
</Files>

That sample tosses a 403 Forbidden when you access a file in any browser, BUT the server itself can still read the files from disk, so the site keeps working. For example:

http://demo.mywhmcs.com/templates/portal/ -- This is a direct link to a template I'm not using, nor do I have it protected, as it's a default template. Notice how you can see all the TPL files (and others).

http://demo.mywhmcs.com/templates/macish -- This one has an .htaccess file in it preventing the TPL files from being shown. Notice that when you view http://demo.mywhmcs.com, you're able to browse the site with NO issues, despite the fact that everything is powered by TPL files.

BTW, ignore the **** design on there ;).

One thing that's crucial here is the ORDER in the first line: with Order Allow,Deny the default is to deny and a Deny wins over an Allow, while with Order Deny,Allow the default is to allow and a later Allow can override the Deny. You can read about it on apache.org. :)
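
To see the difference, compare these two variants (a rough sketch in Apache 2.2 syntax; Apache 2.4 replaces Order/Allow/Deny with Require directives):

# Order Allow,Deny: Allow is checked first, then Deny; the default is deny.
# Nothing matches an Allow here, so every HTTP request gets a 403.
<Files "*.tpl">
Order Allow,Deny
Deny from All
</Files>

# Order Deny,Allow: Deny is checked first, then Allow; the default is allow.
# Deny from all blocks everyone, but the later Allow punches a hole for localhost.
<Files "*.tpl">
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
</Files>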

Edit: Bah, I see it's for .js files though -- that's not possible, sorry! The same goes for CSS and anything else the browser itself has to fetch.


As far as I know, if it needs to be downloaded to the client, you can't restrict direct access since that's how the browser will obtain it too, as theblazingangel said.

There are a couple of ways you could go about making it hard though:

  1. Try Google's Closure Compiler, which will optimize, and as a by-product obfuscate, your JavaScript, making it hard for humans to read. This is the easiest and probably the best solution.
  2. Try loading your scripts through a PHP file that detects whether the request was made automatically by the browser loading the page, or manually by a human trying to view the script (see the sketch after this list). I still don't know if it's possible to tell these apart reliably, but it's perhaps worth looking into if you're THAT concerned.
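
As a rough sketch of option 2 -- the file names and domain are hypothetical, and remember the Referer header is trivially spoofed and often absent:

<?php
// serve-js.php?f=app.js -- only serve the script when the Referer
// claims the request came from a page on this site.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if (strpos($referer, 'http://mydomain.com/') !== 0) {
    header('HTTP/1.0 403 Forbidden'); // typed-in requests have no Referer
    exit;
}
$file = isset($_GET['f']) ? basename($_GET['f']) : ''; // strip path components
$path = __DIR__ . '/scripts/' . $file;
if (!preg_match('/\.js$/', $file) || !is_file($path)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}
header('Content-Type: application/javascript');
readfile($path);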


The key here is understanding the difference between server side and client side code. Server side code can - and should - be 100% hidden from the user. This includes PHP, .NET, and all that kind of stuff.

JavaScript, on the other hand, is like HTML and CSS. It needs to be read by the browser - and if the browser needs to read it, humans who are interested in reading it will be able to as well. There isn't a way to block access to JS files, because if humans can't access them, browsers won't be able to either ;)


Thank you all for the answers!

I thought there was some variable (like HTTP_REFERER) that stores the URL from the address bar, so I could compare against it, and if it ends with .js or any other file type I wish to protect from typed-in access, redirect to an error page. If the request for the file came from the browser or the server, no action would be taken.

Does this mean I can't prevent typed-in access to .js, .css, .txt or image files that will be used by the browser?

  On 30/05/2010 at 04:05, andressito said:

Thank you all for the answers!

I thought there was some variable (like HTTP_REFERER) that stores the URL from the address bar, so I could compare against it, and if it ends with .js or any other file type I wish to protect from typed-in access, redirect to an error page. If the request for the file came from the browser or the server, no action would be taken.

Does this mean I can't prevent typed-in access to .js, .css, .txt or image files that will be used by the browser?

the referrer header tells you the page the user came from. in other words, the page they were previously on, not the current one.

it is extremely unreliable since it is easily spoofed (just like all browser headers), and very often is simply empty because a) the user didn't visit anything before your page; or b) more likely, their browser/security software is configured not to send any referrers, for privacy/security reasons.

for future reference, if you're writing anything that does anything at all, never rely on anything the user sends to you.

you can prevent type-in access to the files, but it's pointless, since you will eventually be sending the whole content to the user anyway. say, for example, you put all your files in a directory that's forbidden to the public, and have a php script which reads the files and serves them back if a certain variable exists, and you set that variable in the index file. this way you've effectively prevented type-in access in the strict sense, but the php script is still giving the user an exact copy of the file. this is useful in some circumstances, but not very many, and it certainly won't protect the contents of your files. anything that you have to send to the user can't be protected.
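
a rough sketch of that setup, with hypothetical names -- here the "certain variable" is a session flag set by the index file:

<?php
// index.php -- sets the flag, then the page pulls the script through get-script.php
session_start();
$_SESSION['can_fetch_js'] = true;
?>
<script src="get-script.php?f=app.js"></script>

<?php
// get-script.php -- serves files from a non-public directory, but only
// when the flag set by index.php is present in the visitor's session.
session_start();
if (empty($_SESSION['can_fetch_js'])) {
    header('HTTP/1.0 403 Forbidden'); // fresh type-in, no session flag
    exit;
}
$file = isset($_GET['f']) ? basename($_GET['f']) : ''; // strip path components
$path = __DIR__ . '/../private-scripts/' . $file;      // outside the web root
if (!preg_match('/\.js$/', $file) || !is_file($path)) {
    header('HTTP/1.0 404 Not Found');
    exit;
}
header('Content-Type: application/javascript');
readfile($path);

a fresh type-in request gets the 403, but anyone who has already visited the index has the flag set, so the script still hands them an exact copy -- which is exactly the point above.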

  On 31/05/2010 at 21:38, andressito said:

I can protect a folder or put a login check on a page, but I can't protect .js or image files from direct download, just from hotlinking?

hotlink "protection" also relies on referrer headers; it's much more trouble than it's worth.
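
for completeness, the usual recipe looks like the one in the question, except it also lets empty referrers through (otherwise you lock out everyone whose browser sends none) -- and letting empty referrers through is precisely what re-enables type-in access. mydomain.com is the placeholder from the original post:

RewriteEngine On
# no Referer at all? let it through (direct navigation, privacy tools)
RewriteCond %{HTTP_REFERER} !^$
# Referer present but pointing at some other site? forbid the request
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?mydomain\.com/ [NC]
RewriteRule \.(js|css|gif|jpe?g|png)$ - [F,L]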
