Website data scraping uses software to automatically extract specific information from websites. Such software is written to simulate the way a person would browse pages and search for information. Companies collect this data for a variety of purposes, originally to monitor competitors' online pricing, and more broadly to conduct market research and similar tasks.
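As a rough illustration of the extraction step described above, the sketch below pulls product names and prices out of an HTML page using only the Python standard library. The page content, the `product`/`name`/`price` class names, and the `PriceScraper` class are all hypothetical; in a real scrape the HTML would first be fetched over HTTP (for example with `urllib.request`), but it is hard-coded here so the example is self-contained.

```python
from html.parser import HTMLParser

# Hypothetical competitor price page, hard-coded in place of an HTTP fetch
# so the sketch runs without network access.
SAMPLE_PAGE = """
<html><body>
  <div class="product"><span class="name">Widget A</span>
    <span class="price">19.99</span></div>
  <div class="product"><span class="name">Widget B</span>
    <span class="price">24.50</span></div>
</body></html>
"""

class PriceScraper(HTMLParser):
    """Collects product names and prices from <span class="name"> and
    <span class="price"> elements (class names are assumptions)."""
    def __init__(self):
        super().__init__()
        self._field = None           # which span class we are currently inside
        self.names, self.prices = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "span":
            cls = dict(attrs).get("class")
            if cls in ("name", "price"):
                self._field = cls

    def handle_data(self, data):
        if self._field == "name":
            self.names.append(data.strip())
        elif self._field == "price":
            self.prices.append(float(data.strip()))
        self._field = None           # only capture the text node right after the tag

scraper = PriceScraper()
scraper.feed(SAMPLE_PAGE)
products = list(zip(scraper.names, scraper.prices))
print(products)
```

Pairing the extracted names with prices like this is exactly the kind of competitor price monitoring mentioned above, just reduced to its simplest form.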
The technology has matured and become affordable to a much wider range of users through tools such as the Mozenda data-scraping software. Services like these now allow individuals and companies to scrape data without extensive knowledge of computer programming.