
Working in a sub directory #16

Open
bleutzinn opened this issue Apr 1, 2019 · 8 comments

@bleutzinn

In the first step of your installation instructions, you state:
"Note: This framework does currently work in the web root (at / under your domain) only."

However, my experience so far (I'm not very far yet) is that the framework works well when installed in a subdirectory like `https://example.com/sf-pfv/`.

Does this limitation still apply or am I heading for disaster at some point later on?

@ocram ocram added the question label Apr 1, 2019
@ocram
Contributor

ocram commented Apr 1, 2019

Thanks, good question!

Are you using Apache or nginx?

If I remember correctly, the problems had to do with the routing rules in the included .htaccess and the public folder. So what we want is actually the following:

If a request comes in, the web server first checks the public directory and looks for the file there. So if the request is for GET /robots.txt, the server should first check if public/robots.txt exists. This is really helpful for static assets, so that the requests don’t have to be routed through PHP. If the file does not exist in public, the front controller of the framework (index.php) receives the request. And it has to receive the correct route being requested, i.e. /robots.txt.

So some of that did not work in subdirectories with Apache. Perhaps you’d want to test this and see where it goes wrong.
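For reference, that lookup order corresponds roughly to these rules from the shipped .htaccess (the full routing section is quoted later in this thread):

# First check if the requested file exists in the 'public' directory
RewriteCond %{DOCUMENT_ROOT}/public%{REQUEST_URI} -f
RewriteRule ^ public%{REQUEST_URI} [L]

# Otherwise let 'index.php' handle the request
RewriteRule . index.php [L]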

@bleutzinn
Author

I'm using Apache. Thanks to your elaborate answer I came up with the following solution.

To allow normal, direct access to some specific files, I added a rule similar to your existing rule that prevents rewriting of files in the public directory.
I added it at the start of the "BEGIN ROUTING" section in the .htaccess file:

RewriteEngine On

# Don't rewrite requests for these specific files
RewriteRule ^robots\.txt$ - [L]

A request for GET https://example.com/my_sub_dir/robots.txt now simply returns that file to the browser. Any other file in the site root (other than robots.txt in this case) still gets routed through the index.php script as usual.

@ocram
Contributor

ocram commented Apr 2, 2019

Thanks, that looks good, but is obviously not what we want here as a general solution, because it is not as convenient, requiring you to register every single file you put into public/ in your .htaccess.

But as a workaround in your case, it should be fine. And you shouldn’t experience any other problems due to the subdirectory – except for those routing issues with the public/ directory.

@ocram ocram closed this as completed Apr 2, 2019
@bleutzinn
Author

Apparently my explanation wasn't clear. I'm sorry for that.

The extra rule does not apply to the public directory. In the example it allows direct access to the robots.txt file in the site root of the website, which may be in a subdirectory of the domain's web root.

This is the complete section in the .htaccess file:

########## BEGIN ROUTING (https://github.com/delight-im/PHP-Router) ##########

<IfModule mod_rewrite.c>

	RewriteEngine On
	
	# Don't rewrite requests for these specific files
	RewriteRule ^robots\.txt$ - [L]

	# Don't rewrite requests for files in the 'public' directory
	RewriteRule ^(public)($|/) - [L]

	# For all other files first check if they exist in the 'public' directory
	RewriteCond %{DOCUMENT_ROOT}/public%{REQUEST_URI} -f
	RewriteRule ^ public%{REQUEST_URI} [L]

	# And let 'index.php' handle everything else
	RewriteRule . index.php [L]

</IfModule>

So, if you have a main domain example.com and a subdirectory sf-pfv, then the site root will be something like public_html/example.com/sf-pfv. This directory will then contain both the robots.txt file and the app's main entry point, index.php, plus all other files making up the app.

Including additional directly accessible files by adding a rule for each like this is acceptable in my opinion since it involves no more than a couple of files. Others I can think of besides robots.txt are sitemap.xml and possibly a sitemap.xsl.
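For example, the whitelist at the start of the routing section would then look like this (sitemap.xml and sitemap.xsl being just the candidates mentioned above; adjust the list to whatever files need to be directly accessible):

# Don't rewrite requests for these specific files
RewriteRule ^robots\.txt$ - [L]
RewriteRule ^sitemap\.xml$ - [L]
RewriteRule ^sitemap\.xsl$ - [L]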

@ocram
Contributor

ocram commented Apr 2, 2019

Thanks for the clarification, though that doesn’t make much of a difference. Instead of storing your static assets in /public/, you store them directly in /. Whatever location is chosen here, you need one extra rule in the .htaccess per file – as a whitelist – for that file to be accessible.

As a general-purpose solution for this framework, this doesn’t work because it isn’t as convenient as just putting files into /public/.

“Including additional directly accessible files by adding a rule for each like this is acceptable in my opinion since it involves no more than a couple of files.”

I understood that, and in this case, it’s perfectly fine to choose the solution you have decided for.

In many web applications, however, the developers add dozens or even hundreds of static assets into /public/, e.g. JavaScript files, CSS files, images, etc.

@bleutzinn
Author

Okay. However, I just don't understand why you want to advertise a limitation which can so easily be lifted.

@ocram
Copy link
Contributor

ocram commented Apr 2, 2019

Yes, I’m not really sure about that either, but we had other priorities than making this work. Anyway, I’m happy that this issue now exists and documents both the problem and a solution.

@ocram
Contributor

ocram commented May 17, 2020

There’s now a preliminary workaround that addresses both the issue discussed here and the following limitation mentioned in the README:

Note: This framework does currently work in the web root (at / under your domain) only.

The workaround does not require the developer to list individual files in /.htaccess anymore. Instead, everything works as usual, just with a different set of rewrite rules. However, these rules have not been extensively tested yet, which is why this is only a workaround in “alpha” stage.

In /.htaccess, replace

# Don't rewrite requests for files in the 'public' directory
RewriteRule ^(public)($|/) - [L]

# For all other files first check if they exist in the 'public' directory
RewriteCond %{DOCUMENT_ROOT}/public%{REQUEST_URI} -f
RewriteRule ^ public%{REQUEST_URI} [L]

# And let 'index.php' handle everything else
RewriteRule . index.php [L]

with

# First check if a file exists in 'public'
RewriteCond %{REQUEST_URI}::$1 ^(.*?/)(.*)::\2
RewriteCond %{DOCUMENT_ROOT}%1public/%2 -f
RewriteRule ^(.*)$ public/$1 [END]

# Let 'index.php' handle everything else
RewriteRule . index.php [END,QSA]
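To sketch how the new conditions resolve, assume the app is installed in a subdirectory /sf-pfv/ (just the example directory from earlier in this thread) and a request comes in for GET /sf-pfv/robots.txt:

# In this per-directory context the pattern ^(.*)$ matches the path relative
# to the subdirectory, so $1 = "robots.txt".
# The test string becomes "/sf-pfv/robots.txt::robots.txt"; the lazy group (.*?/)
# grows until the part after "::" equals \2, splitting the URI into
# %1 = "/sf-pfv/" (the installation prefix) and %2 = "robots.txt".
RewriteCond %{REQUEST_URI}::$1 ^(.*?/)(.*)::\2

# Now check whether DOCUMENT_ROOT + "/sf-pfv/" + "public/robots.txt" exists on disk.
RewriteCond %{DOCUMENT_ROOT}%1public/%2 -f

# If it does, rewrite to "public/robots.txt"; mod_rewrite prepends the
# per-directory prefix again, so the file is served from /sf-pfv/public/robots.txt,
# and END stops any further rewriting.
RewriteRule ^(.*)$ public/$1 [END]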

There’s also a related issue in the Apache issue tracker now:
https://bz.apache.org/bugzilla/show_bug.cgi?id=64447

@ocram ocram reopened this May 17, 2020