Commit c5aba35

way better with the PR action
hobgoblina authored Apr 11, 2024
1 parent 091371b commit c5aba35
Showing 2 changed files with 35 additions and 67 deletions.
80 changes: 13 additions & 67 deletions .github/workflows/update-robots-txt.yml
@@ -1,7 +1,5 @@
 name: Update robots.txt
 
-permissions: write-all
-
 on:
   schedule:
     - cron: '0 0 * * 0'
@@ -11,78 +9,26 @@ jobs:
   update-robots:
     runs-on: ubuntu-latest
     steps:
-      - name: Create Branch
-        env:
-          GITHUB_TOKEN: ${{ github.token }}
-        uses: peterjgrainger/[email protected]
-        with:
-          branch: robots.txt-update
       - name: Checkout
         uses: actions/checkout@v3
-        with:
-          ref: robots.txt-update
-          fetch-depth: 0
       - name: Update robot.txt
         id: update
         env:
           API_KEY: ${{ secrets.ROBOTS }}
         run: |
-          '# _---~~(~~-_.
-          # _{ ) )
-          # , ) -~~- ( ,-' )_
-          # ( `-,_..`., )-- '_,)
-          # ( ` _) ( -~( -_ `, }
-          # (_- _ ~_-~~~~`, ,' )
-          # `~ -^( __;-,((()))
-          # ~~~~ {_ -_(())
-          # `\ }
-          # { }
-          # BRAAAAAAAAIIIINNSSSSSSS
-          ' >> public/robots.txt
+          cp robots-base.txt public/robots.txt
           curl --location 'https://api.darkvisitors.com/robots-txts' \
             --header 'Content-Type: application/json' \
-            --header 'Authorization: Bearer $API_KEY' \
-            --data '{ "agent_types": [ "AI Data Scraper", "AI Assistant", "AI Search Crawler" ], "disallow": "/" }' > public/robots.txt
-          git add public/robots.txt
-          changes=$(git push origin 2>&1)
-          if [ "$changes" = "Everything up-to-date" ]; then
-            echo "skip=true" >> "$GITHUB_OUTPUT"
-          fi
-      - name: Check if PR exists
-        env:
-          GH_TOKEN: ${{ github.token }}
-        id: check
-        run: |
-          prs=$(gh pr list \
-            --repo "$GITHUB_REPOSITORY" \
-            --json baseRefName,headRefName \
-            --jq '
-              map(select(.baseRefName == "dev" and .headRefName == "robots.txt-update"))
-              | length
-            ')
-          if ((prs > 0)); then
-            echo "skip=true" >> "$GITHUB_OUTPUT"
-          fi
-      - name: Create Pull Request
-        if: |
-          !steps.check.outputs.skip &&
-          !steps.update.outputs.skip
-        uses: actions/github-script@v6
+            --header "Authorization: Bearer $API_KEY" \
+            --data '{ "agent_types": [ "AI Data Scraper", "AI Assistant", "AI Search Crawler" ], "disallow": "/" }' >> public/robots.txt
+      - name: Create pull request
+        uses: peter-evans/create-pull-request@v6
         with:
-          script: |
-            const { repo, owner } = context.repo;
-            const result = await github.rest.pulls.create({
-              title: 'Update robots.txt',
-              owner,
-              repo,
-              head: 'robots.txt-update',
-              base: 'dev',
-              body: 'This PR was *auto-generated* by the `Update robots.txt` action and contains updates to our robots.txt file, pulled from [Dark Visitors](https://darkvisitors.com/).'
-            });
-            github.rest.issues.addLabels({
-              owner,
-              repo,
-              issue_number: result.data.number,
-              labels: ['automated pr']
-            });
+          token: ${{ secrets.GITHUB_TOKEN }}
+          branch: robots.txt-update
+          title: "Update robots.txt"
+          commit-message: "Update robots.txt"
+          labels: 'robots.txt'
+          add-paths: public/robots.txt
+          reviewers: hobgoblina,mannazsci,sneakers-the-rat
+          body: This PR was generated by the `Update robots.txt` action and contains updates to our robots.txt file, pulled from [Dark Visitors](https://darkvisitors.com/).
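
Two functional fixes are buried in the diff above: the old step quoted the Authorization header in single quotes, so the shell sent the literal string $API_KEY rather than the secret, and the curl output was redirected with > (overwrite) instead of >> (append), clobbering everything written before it. The rewritten step fixes both. A minimal sketch of the new fetch-and-append flow, runnable outside Actions and assuming API_KEY holds a Dark Visitors token (the ROBOTS secret in CI):

    #!/usr/bin/env bash
    set -euo pipefail

    # API_KEY must hold a Dark Visitors bearer token.
    API_KEY="${API_KEY:?set API_KEY to a Dark Visitors token}"

    # Start from the hand-maintained base rules...
    cp robots-base.txt public/robots.txt

    # ...then append the generated AI-crawler rules. The double quotes are
    # what let the shell expand $API_KEY; single quotes send it literally.
    curl --location 'https://api.darkvisitors.com/robots-txts' \
      --header 'Content-Type: application/json' \
      --header "Authorization: Bearer $API_KEY" \
      --data '{ "agent_types": [ "AI Data Scraper", "AI Assistant", "AI Search Crawler" ], "disallow": "/" }' \
      >> public/robots.txt

The switch to peter-evans/create-pull-request@v6 is what makes the rest of the deletion possible: that action diffs the workspace, commits the changes to the named branch, and opens or updates a single PR, so the hand-rolled Create Branch, push, and PR-existence steps are no longer needed.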
22 changes: 22 additions & 0 deletions robots-base.txt
@@ -0,0 +1,22 @@
# .__---~~~(~~-_.
# _-' ) -~~- ) _-" )_
# ( ( `-,_..`.,_--_ '_,)_
# ( -_) ( -_-~ -_ `, )
# (_ -_ _-~-__-~`, ,' )__-'))--___--~~~--__--~~--___--__..
# _ ~`_-'( (____;--==,,_))))--___--~~~--__--~~--__----~~~'`=__-~+_-_.
# (@) (@) ````` `-_(())_-~
#
# ,---. .=-.-..-._ ,-,--.
# _..---. .-.,.---. .--.' \ /==/_ /==/ \ .-._ ,-.'- _\
# .' .'.-. \ /==/ ` \ \==\-/\ \ |==|, ||==|, \/ /, /==/_ ,_.'
# /==/- '=' /|==|-, .=., |/==/-|_\ | |==| ||==|- \| |\==\ \
# |==|-, ' |==| '=' /\==\, - \ |==|- ||==| , | -| \==\ -\
# |==| .=. \|==|- , .' /==/ - ,| |==| ,||==| - _ | _\==\ ,\
# /==/- '=' ,|==|_ . ,'./==/- /\ - \|==|- ||==| /\ , |/==/\/ _ |
# |==| - //==/ /\ , )==\ _.\=\.-'/==/. //==/, | |- |\==\ - , /
# `-._`.___,' `--`-`--`--' `--` `--`-` `--`./ `--` `--`---'

User-agent: *
Disallow: /media_proxy/
Disallow: /interact/
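
After the workflow runs, the published public/robots.txt is this base file with the Dark Visitors response appended after it. The exact agent list comes from the API at update time; as a purely hypothetical illustration, the appended portion consists of per-crawler blocks shaped like:

    # AI Data Scraper
    User-agent: ExampleBot
    Disallow: /

(ExampleBot is a placeholder, not a real crawler name; the live output covers whatever agents currently fall under the requested agent_types.)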
