Describe the bug
robots.txt is supposed to be generated automatically when `production` is set to `true`, overwriting the default robots.txt file, which disallows indexing. This is not happening.
When running locally, the robots.txt file is generated correctly, but it is not generated in the GitHub Action. Figure out where the workflow is going wrong.
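One way to catch this regression in CI would be a sanity check after the build step. The snippet below is a hedged sketch, not the project's actual workflow: the `public/` output directory and the exact `Disallow` line are assumptions, and the first two lines only simulate a generated production robots.txt so the check can run standalone.

```shell
# Simulate a production build output (assumption: site builds into ./public).
mkdir -p public
printf 'User-agent: *\nAllow: /\n' > public/robots.txt

# Fail loudly if the default "disallow everything" file survived the build.
if grep -q '^Disallow: /$' public/robots.txt; then
  echo "robots.txt still blocks indexing"
else
  echo "robots.txt allows crawling"
fi
```

Dropped into the workflow after the build (with `exit 1` in the failure branch), this would turn the silent overwrite failure into a failed Action run.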
I requested a recrawl, which will surface any issues with the robots.txt file. Given that tools such as SEOmator consider the robots.txt file valid and report that it allows both Google and Bing to crawl, I don't expect any issues.