Answer: C. Project planning
Answer:
crawler
Explanation:
A crawler (also known as a spider or robot) is a component of a search engine that indexes websites automatically. Its main purpose is to systematically browse the World Wide Web, typically for the purpose of Web indexing.
It does this by visiting a list of links; as it visits each page, it identifies all the hyperlinks found in that page and adds them to the list of links to visit.
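To make the idea concrete, here is a minimal sketch of that visit-and-harvest loop in Python, using only the standard library. The names `crawl`, `LinkExtractor`, and the `max_pages` limit are illustrative choices, not part of any real search engine's implementation:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href value of every <a> tag encountered."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: visit a page, harvest its hyperlinks,
    and append unseen links to the queue of pages to visit."""
    queue = deque([seed_url])
    visited = set()
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        try:
            with urlopen(url, timeout=5) as response:
                html = response.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip pages that fail to load
        visited.add(url)
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if absolute not in visited:
                queue.append(absolute)
    return visited
```

A real crawler would also respect robots.txt, rate-limit its requests, and store the page content for indexing, but the core loop of visiting links and copying newly found hyperlinks back onto the to-visit list is the same.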
Flexibility and open-mindedness
Being quick to adapt to technology changes
Having a positive attitude
Taking initiative to solve problems
No, it is not, but you will need an Ethernet cable.