
added code for creating robots.txt #838

Closed
wants to merge 4 commits

Conversation

rskbansal

@rskbansal rskbansal commented Oct 27, 2022

Description

Previously, the server returned the robots.txt content as an in-memory text response when localhost/robots.txt was visited and the -r or --robots argument had been provided. Now, when either argument is passed, the server instead creates a robots.txt file in its root directory (either ./ or ./public).

Relevant issues

Fixes #825

Contributor checklist

  • Provide tests for the changes (unless documentation-only)
  • Documented any new features, CLI switches, etc. (if applicable)
    • Server --help output
    • README.md
    • doc/http-server.1 (use the same format as other entries)
  • The pull request is being made against the master branch

Maintainer checklist

  • Assign a version triage tag
  • Approve tests if applicable

@rskbansal rskbansal marked this pull request as ready for review October 27, 2022 22:30
@rskbansal
Author

@thornjad

@BigBlueHat
Member

@rskbansal I'm not sure we want an http-server writing files. What's the motivation for the change?

@rskbansal
Author

A robots.txt file shall be physically present on the server, that's what it is meant to be

@zbynek
Contributor

zbynek commented Jun 2, 2023

file shall be physically present

but why? For the user it's only important that the response for localhost/robots.txt contains the data, not that the file exists. The current implementation looks fine (except for the === true check mentioned in #825)
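The `=== true` problem mentioned above can be illustrated with a hedged sketch. `robotsResponse` is an illustrative helper, not http-server's actual API, and the default directives shown are an assumption; the point is only that a strict `=== true` check misses the case where the option arrives as the string `'true'` from the CLI parser, or as custom robots.txt content.

```javascript
// Illustrative sketch of handling a --robots option whose value may be a
// boolean, the string 'true', or custom robots.txt content. A strict
// `options.robots === true` check would wrongly skip the string cases.
function robotsResponse(robotsOption) {
  if (robotsOption === false || robotsOption == null) {
    return null; // flag not set: fall through to normal handling
  }
  // Boolean true and the string 'true' both mean "serve the default";
  // any other string is treated as custom robots.txt content.
  const body = (robotsOption === true || robotsOption === 'true')
    ? 'User-agent: *\nDisallow: /'
    : String(robotsOption);
  return { status: 200, headers: { 'Content-Type': 'text/plain' }, body };
}
```

With this shape, `--robots` passed as either `true` or `'true'` yields the same response, and the file never needs to exist on disk.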

@BigBlueHat
Member

A robots.txt file shall be physically present on the server, that's what it is meant to be

If the file needs to be physically present for your use case, I'd recommend generating it prior to hosting and just not using the --robots option.

@zbynek thanks for connecting this to #825. That does need a fix.

@BigBlueHat BigBlueHat closed this Jun 2, 2023
@BigBlueHat BigBlueHat added the out-of-scope Not part of the purpose and/or responsibility of this project label Jun 2, 2023
Successfully merging this pull request may close these issues.

robots.txt flag is not working