Added fetchers concept: separate scripts to fetch the feeds
Fetchers impersonate a specific client by sending the same request headers that client would send. This works better than a plain curl request with only a faked user agent, because curl on its own does not send the client's other headers, so its traffic stands out.
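For illustration, a fetcher could be invoked like this (a minimal sketch; RANDRSS_ROOT is the variable used by the script below, while the checkout path, feed URL, and output file are hypothetical):

export RANDRSS_ROOT=/path/to/randrss    # hypothetical checkout location
"$RANDRSS_ROOT"/fetchers/firefox 'https://example.com/feed.xml' /tmp/feed.xml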
fetchers/firefox  (new executable file, 11 lines)
@@ -0,0 +1,11 @@
#!/bin/sh
set -x
# Tries more or less to look like Firefox
if [ $# -ne 2 ] ; then
    echo "usage: $0 url output" 1>&2
    exit 1
fi
# Better to randomize: pick a user agent at random from the list
useragent=$(shuf -n 1 "$RANDRSS_ROOT/fetchers/firefox_agents")

curl "$1" -H "User-Agent: $useragent" -H 'Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8' -H 'Accept-Language: en-US,en;q=0.5' -H 'Accept-Encoding: gzip, deflate, br' --compressed -H 'Connection: keep-alive' -H 'Upgrade-Insecure-Requests: 1' > "$2"
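The firefox_agents file referenced above is not included in this commit; presumably it is a plain list of Firefox user-agent strings, one per line, for shuf to pick from. A hypothetical example (the platform and version strings are made up for illustration):

Mozilla/5.0 (X11; Linux x86_64; rv:60.0) Gecko/20100101 Firefox/60.0
Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:61.0) Gecko/20100101 Firefox/61.0
Mozilla/5.0 (Macintosh; Intel Mac OS X 10.13; rv:60.0) Gecko/20100101 Firefox/60.0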