The “Crawl-delay” directive in the robots.txt file asks crawlers to wait a specified number of seconds between successive requests.

A. 对
B. 错


A parallel crawler runs multiple crawling processes in parallel, with the goals of maximizing the download rate, minimizing the overhead of parallelization, and avoiding repeated downloads of the same page.

A. 对
B. 错

The new URLs discovered during the crawling process are assigned to the crawling processes so that the same page is not downloaded more than once.

A. 对
B. 错

Consider the following program:

    #include <stdio.h>
    int main() {
        int a1, a2;
        char c1, c2;
        scanf("%d%d", &a1, &a2);
        scanf("%c%c", &c1, &c2);
    }

If a1, a2, c1, and c2 are to take the values 10, 20, A, and B respectively, the correct input is () (Note: □ denotes a space; a second marker, not preserved in this copy, denotes Enter)

A. 1020AB
B. 10□20AB
C. 10□20□ABC
D. 10□20AB

If x and y are both declared as int and z as double, the scanf() call below that is not legal is ()

A. scanf("%d%1x,%le",&x,&y,&z);
B. scanf("%2d%d%lf",&x,&y,&z);
C. scanf("%x%*d%o",&x,&y,&z);
D. scanf("%x%ox,%6.3f",&x,&y,&z);
