b374k
m1n1 1.01
Apache/2.2.15 (CentOS)
Linux obd60-6c49958d75-2q7cw 5.4.0-174-generic #193-Ubuntu SMP Thu Mar 7 14:29:28 UTC 2024 x86_64
uid=48(apache) gid=48(apache) groups=48(apache)
server ip : 104.21.65.202 | your ip : 10.244.126.0
safemode OFF
> /usr/lib64/python2.6/
Filename: /usr/lib64/python2.6/robotparser.pyo
Size: 7.67 kb
Permission: rw-r--r--
Owner: apache
Create time: 23-Dec-2025 17:41
Last modified: 20-Jun-2019 19:45
Last accessed: 22-Apr-2026 05:24
""" robotparser.py

    Copyright (C) 2000  Bastian Kleineidam

    You can choose between two licenses when using this package:
    1) GNU GPLv2
    2) PSF license for Python 2.2

    The robots.txt Exclusion Protocol is implemented as specified in
    http://info.webcrawler.com/mak/projects/robots/norobots-rfc.html
"""
import urlparse
import urllib

__all__ = ["RobotFileParser"]


class RobotFileParser:
    """ This class provides a set of methods to read, parse and answer
    questions about a single robots.txt file.

    """

    def __init__(self, url=''):
        self.entries = []
        self.default_entry = None
        self.disallow_all = False
        self.allow_all = False
        self.set_url(url)
        self.last_checked = 0

    def mtime(self):
        """Returns the time the robots.txt file was last fetched.

        This is useful for long-running web spiders that need to
        check for new robots.txt files periodically.

        """
        return self.last_checked

    def modified(self):
        """Sets the time the robots.txt file was last fetched to the
        current time.

        """
        import time
        self.last_checked = time.time()

    def set_url(self, url):
        """Sets the URL referring to a robots.txt file."""
        self.url = url
        self.host, self.path = urlparse.urlparse(url)[1:3]

    def read(self):
        """Reads the robots.txt URL and feeds it to the parser."""
        opener = URLopener()
        f = opener.open(self.url)
        lines = [line.strip() for line in f]
        f.close()
        self.errcode = opener.errcode
        if self.errcode in (401, 403):
            self.disallow_all = True
        elif self.errcode >= 400:
            self.allow_all = True
        elif self.errcode == 200 and lines:
            self.parse(lines)

    def _add_entry(self, entry):
        if "*" in entry.useragents:
            # the default entry is considered last
            if self.default_entry is None:
                self.default_entry = entry
        else:
            self.entries.append(entry)

    def parse(self, lines):
        """parse the input lines from a robots.txt file.
           We allow that a user-agent: line is not preceded by
           one or more blank lines."""
        state = 0
        linenumber = 0
        entry = Entry()

        for line in lines:
            linenumber += 1
            if not line:
                if state == 1:
                    entry = Entry()
                    state = 0
                elif state == 2:
                    self._add_entry(entry)
                    entry = Entry()
                    state = 0
            # remove optional comment and strip line
            i = line.find('#')
            if i >= 0:
                line = line[:i]
            line = line.strip()
            if not line:
                continue
            line = line.split(':', 1)
            if len(line) == 2:
                line[0] = line[0].strip().lower()
                line[1] = urllib.unquote(line[1].strip())
                if line[0] == "user-agent":
                    if state == 2:
                        self._add_entry(entry)
                        entry = Entry()
                    entry.useragents.append(line[1])
                    state = 1
                elif line[0] == "disallow":
                    if state != 0:
                        entry.rulelines.append(RuleLine(line[1], False))
                        state = 2
                elif line[0] == "allow":
                    if state != 0:
                        entry.rulelines.append(RuleLine(line[1], True))
                        state = 2
        if state == 2:
            self._add_entry(entry)

    def can_fetch(self, useragent, url):
        """using the parsed robots.txt decide if useragent can fetch url"""
        if self.disallow_all:
            return False
        if self.allow_all:
            return True
        # search for given user agent matches
        # the first match counts
        url = urllib.quote(urlparse.urlparse(urllib.unquote(url))[2]) or "/"
        for entry in self.entries:
            if entry.applies_to(useragent):
                return entry.allowance(url)
        # try the default entry last
        if self.default_entry:
            return self.default_entry.allowance(url)
        # agent not found ==> access granted
        return True

    def __str__(self):
        return ''.join([str(entry) + "\n" for entry in self.entries])


class RuleLine:
    """A rule line is a single "Allow:" (allowance==True) or "Disallow:"
       (allowance==False) followed by a path."""

    def __init__(self, path, allowance):
        if path == '' and not allowance:
            # an empty value means allow all
            allowance = True
        self.path = urllib.quote(path)
        self.allowance = allowance

    def applies_to(self, filename):
        return self.path == "*" or filename.startswith(self.path)

    def __str__(self):
        return (self.allowance and "Allow" or "Disallow") + ": " + self.path


class Entry:
    """An entry has one or more user-agents and zero or more rulelines"""

    def __init__(self):
        self.useragents = []
        self.rulelines = []

    def __str__(self):
        ret = []
        for agent in self.useragents:
            ret.extend(["User-agent: ", agent, "\n"])
        for line in self.rulelines:
            ret.extend([str(line), "\n"])
        return ''.join(ret)

    def applies_to(self, useragent):
        """check if this entry applies to the specified agent"""
        # split the name token and make it lower case
        useragent = useragent.split("/")[0].lower()
        for agent in self.useragents:
            if agent == '*':
                # we have the catch-all agent
                return True
            agent = agent.lower()
            if agent in useragent:
                return True
        return False

    def allowance(self, filename):
        """Preconditions:
        - our agent applies to this entry
        - filename is URL decoded"""
        for line in self.rulelines:
            if line.applies_to(filename):
                return line.allowance
        return True


class URLopener(urllib.FancyURLopener):
    def __init__(self, *args):
        urllib.FancyURLopener.__init__(self, *args)
        self.errcode = 200

    def prompt_user_passwd(self, host, realm):
        return None, None

    def http_error_default(self, url, fp, errcode, errmsg, headers):
        self.errcode = errcode
        return urllib.FancyURLopener.http_error_default(self, url, fp,
                                                        errcode, errmsg,
                                                        headers)