Prevent access to files using .htaccess


Question

Hi everyone,

I'm trying to use an .htaccess file to prevent type-in (direct URL) access to the .js (JavaScript) files located in my /scripts folder

with:

<Files ~ "\.js$">
Order Deny,Allow
Deny from all
</Files>

This blocks type-in access, but then the pages cannot use the scripts either. I added "Allow from 127.0.0.1", but it made no difference.

I also tried:

Options +FollowSymlinks
RewriteEngine On

RewriteCond %{HTTP_REFERER} !^http://mydomain.* [NC]
RewriteRule .*\.(js)$ http://mydomain.com [R,L]

In this case the scripts work, but the first time I type the location of a .js file into the address bar, it opens. Only when I refresh the page do I get redirected. I really can't understand why this happens...
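For reference, a tightened version of the rule above (keeping mydomain as the placeholder from the question) escapes the dots in the referrer pattern and anchors the extension; this is a sketch of the usual referrer-check form, not a guarantee it will behave differently:

```apacheconf
Options +FollowSymlinks
RewriteEngine On

# Redirect any .js request whose Referer does not start with this
# site's URL. An empty Referer (e.g. a typed-in URL) also fails the
# condition, so it gets redirected as well.
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?mydomain\.com/ [NC]
RewriteRule \.js$ http://mydomain.com/ [NC,R=302,L]
```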

I find this useful for blocking hotlinks, but it doesn't always block type-in requests.

By the way, I have no access to the Apache configuration.

Any help would be appreciated.

Thanks

7 answers to this question

Recommended Posts

  • 0

The referrer is the page/site you came from to load the current page. When you type in the URL, I'd expect the referrer to be blank; however, on a refresh it seems to be set to the "previous" page displayed!

What you're trying to do, as far as I know, isn't possible. However you go about trying to block access to the .js files, they have to be sent to the browser somehow so it can use them; there's just no getting around it. Whatever you do, the code has to be available to the browser for it to run, and in being so it automatically becomes accessible to the user too!

Think about it:

1) You could put the .js files in a directory outside the web root, so they are publicly inaccessible, and then use a rewrite or a PHP file to get at them, but that makes absolutely no difference...

2) You could encrypt them and ship a piece of unencrypted JS that decrypts them and lets them run; the average person couldn't see the code, but anyone with JS skills could easily obtain a decrypted copy...

They have to be public one way or another; all you can do is make things a little more difficult. I'd just give up if I were you, it's not worth the effort ;)

  • 0

I use WHMCS, which uses .tpl files. The unfortunate part was that a user could access filename.tpl by typing it in directly (assuming they know the exact filename, which anyone who uses the script does). Because of this, when I was selling premium templates, some users were stealing them by simply going to all 40+ .tpl files manually and reading the hard-coded source.

To prevent them from accessing the .tpl files but still allow them to be read by the server, I used the following:

<Files "*.tpl">
Order Allow,Deny
Deny from All
</Files>

That sample tosses a 403 Forbidden when you access the file in any browser, but the server can still read it. For example:

http://demo.mywhmcs.com/templates/portal/ -- This is a direct link to a template I'm not using, and it isn't protected since it's a default template. Notice how you can see all the .tpl files (and others).

http://demo.mywhmcs.com/templates/macish -- This has an .htaccess file tossed into it preventing TPL files from being shown. Notice when you view http://demo.mywhmcs.com, you're able to view the site with NO issues despite the fact that everything is powered from TPL files.

BTW, ignore the **** design on there ;).

The reason your snippet is not working is that the ORDER in the first line is crucial. You can read about it in the Apache documentation on apache.org. :)
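For anyone puzzled by that first line, here is a minimal sketch of the Apache 2.2-style semantics (on Apache 2.4+ these directives were replaced by Require):

```apacheconf
# Order Allow,Deny: Allow directives are evaluated first, then Deny,
# and any request matched by neither is DENIED by default.
<Files "*.tpl">
    Order Allow,Deny
    Deny from All
</Files>

# Order Deny,Allow: Deny directives are evaluated first, then Allow;
# a later Allow overrides a matching Deny, and any request matched
# by neither directive is ALLOWED by default.
<Files "*.tpl">
    Order Deny,Allow
    Deny from All
</Files>
```

With only "Deny from All" present, both orders end up denying every request; the order mainly matters once Allow and Deny directives are combined.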

Edit: Bah, I see it's for .js though; that's not possible, sorry! The same goes for CSS, etc.

  • 0

As far as I know, if it needs to be downloaded to the client, you can't restrict direct access, since the browser obtains it the same way, as theblazingangel said.

There are a couple of ways you could go about making it hard though:

  1. Try Google's Closure Compiler, which will optimize, and as a side effect obfuscate, your JavaScript, making it hard for humans to read. This is the easiest and probably the best solution.
  2. Try loading your scripts through a PHP file that detects whether the request was made automatically by the browser loading the script or manually by a human trying to view it. I still don't know if it's possible, but it's perhaps worth looking into if you're THAT concerned.

  • 0

The key here is understanding the difference between server side and client side code. Server side code can - and should - be 100% hidden from the user. This includes PHP, .NET, and all that kind of stuff.

JavaScript, on the other hand, is like HTML and CSS. It needs to be read by the browser, and if the browser needs to read it, humans who are interested in reading it will be able to. There isn't a way to block access to JS files, because if humans can't access them, browsers can't either ;)

  • 0

Thank you all for the answers!

I thought there was some variable (like HTTP_REFERER) that stores the URL from the address bar, so I could compare it, and if it ends with .js, or any other file I wish to protect from typed-in access, redirect to an error page. If the request for the file came from the browser or the server, no action would be taken.

Does this mean I can't prevent typed-in access to .js, .css, .txt, or image files that will be used by the browser?

  • 0
  On 30/05/2010 at 04:05, andressito said:

Thank you all for the answers!

I thought there was some variable (like HTTP_REFERER) that stores the URL from the address bar, so I could compare it, and if it ends with .js, or any other file I wish to protect from typed-in access, redirect to an error page. If the request for the file came from the browser or the server, no action would be taken.

Does this mean I can't prevent typed-in access to .js, .css, .txt, or image files that will be used by the browser?

The referrer header tells you the page the user came from; in other words, the page they were previously on, not the current one.

It is extremely unreliable, since it is easily spoofed (just like all browser headers), and it is very often simply empty, because a) the user didn't visit anything before your page, or b) more likely, their browser or security software is configured not to send referrers, for privacy/security reasons.

For future reference, if you're writing anything at all, never rely on anything the user sends to you.

You can prevent type-in access to the files, but it's pointless, since you will eventually be sending the whole content to the user anyway.

Say, for example, you put all your files in a directory that's forbidden to the public, and you have a PHP script which reads the files and serves them back if a certain variable exists, with that variable set in the index file. This way you've effectively prevented type-in access in the strict sense, but the PHP script is still handing the user an exact copy of the file.

This is useful in some circumstances, but not very many, and it certainly won't protect the contents of your files. Anything that you have to send to the user can't be protected.
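The forbidden-directory part of that setup can be sketched in .htaccess alone (the /private directory and the loader script are made-up names for illustration):

```apacheconf
# .htaccess inside /private -- block all direct HTTP requests.
# Only server-side code (e.g. a hypothetical loader.php that reads
# the files from disk and echoes them back) can reach the contents.
Order Allow,Deny
Deny from All
```

The browser still receives an exact copy of each file from the loader script, which is why this only blocks type-in access in the narrow sense.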

  On 31/05/2010 at 21:38, andressito said:

I can protect a folder or put a login check on a page, but I can't protect .js or image files from direct download, only from hotlinks?

Hotlink "protection" also relies on referrer headers; it's much more trouble than it's worth.

This topic is now closed to further replies.