The findings about GoPro, the popular action camera company, were determined mostly through web scraping techniques. Alternative data company Eagle Alpha noted in a report published in 2015 that "The data from US electronics websites pointed to potential weakness in GoPro revenue for the third quarter of that year."
The data showed reduced demand for GoPro products, with a negative mix shift toward lower-end products impacting average selling prices. The report also pointed to a weakness in the rankings of best-selling cameras, which included the recently released Session camera. Even though 68 per cent of analyst recommendations at the time rated GoPro a 'buy', Eagle Alpha correctly determined that the company would underperform and miss its targets for the quarter, and that is exactly what happened. The firm also observed that demand for GoPro was weakening and that average selling prices remained under pressure.
Nicholas Woodman, the chief executive officer of GoPro, addressed the issues, saying, "While we experienced strong year-over-year growth, this quarter marks the first time as a publicly traded company that we delivered results below the expectations that we outlined in our guidance." Web scraping allowed Eagle Alpha to analyze and identify the underperformance well ahead of traditional research methods.
What is web scraping? Web scraping refers to the process of harvesting data from public websites and identifying what might be deemed valuable to the user, such as a hedge fund. It is typically carried out using high-powered software.
The typical process of web scraping works in two distinct steps. A web crawler, or 'spider', leads the process, using artificial intelligence (AI) to browse the internet and locate relevant websites. The content is then passed on to the web scraper, a specialized tool designed to extract data quickly and accurately from a web page. Eagle Alpha is one of the biggest proponents of web scraping. Founded in 2012, the company is among the largest in the alternative data space. According to the firm's director of data insights, Ronan Crosson, who spoke to Forbes in December 2019, the firm has compiled a taxonomy of 24 different types of alternative data, with the most commonly deployed alternative datasets being "web scraped data, credit card data and consumer sentiment data".
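The extraction step can be illustrated with a minimal sketch. The HTML below is a hypothetical snapshot of a retailer's best-seller page (not real data); in practice the crawler would fetch such pages over HTTP, and the scraper would then pull out structured fields such as rank and price, from which a signal like average selling price can be computed.

```python
from html.parser import HTMLParser

# Hypothetical snapshot of a retailer's best-seller page; in practice
# the crawler would fetch this HTML from the live site.
PAGE = """
<ol class="bestsellers">
  <li data-rank="1" data-price="399.99">GoPro HERO4 Silver</li>
  <li data-rank="2" data-price="199.99">GoPro HERO Session</li>
  <li data-rank="3" data-price="299.99">Rival Cam X</li>
</ol>
"""

class ProductScraper(HTMLParser):
    """Step two of the pipeline: extract structured rows from raw HTML."""
    def __init__(self):
        super().__init__()
        self.rows = []
        self._pending = None

    def handle_starttag(self, tag, attrs):
        if tag == "li":
            a = dict(attrs)
            self._pending = {"rank": int(a["data-rank"]),
                             "price": float(a["data-price"])}

    def handle_data(self, data):
        if self._pending is not None and data.strip():
            self._pending["name"] = data.strip()
            self.rows.append(self._pending)
            self._pending = None

scraper = ProductScraper()
scraper.feed(PAGE)

# A simple signal: average selling price of the tracked brand's listings.
gopro = [r for r in scraper.rows if r["name"].startswith("GoPro")]
avg_price = sum(r["price"] for r in gopro) / len(gopro)
print(avg_price)  # → 299.99
```

Scraped across many retailers over time, a falling average selling price or a slipping best-seller rank is exactly the kind of mix-shift evidence described in the GoPro case above.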
Among the most closely monitored platforms are social media sites, which are mined to identify changes in sentiment around purchases, especially now that companies increasingly announce developments on handles such as Facebook and Twitter. The data from these platforms plays a crucial part in information-driven trading strategies, helping funds uncover early trends and shifts in sentiment. Twitter's event-driven feeds project helps many quantitative traders capitalize on the influence of social media on markets through constantly evolving curation technology. This also involves Natural Language Processing (NLP) modelling, coupled with the company's reputation for data quality.
"Ultimately the bottom line is scraping data is big business, and it's only going to get bigger as hedge funds begin to establish it as an industry-standard tool," says Daniel Ni, the founder of web scraping tool Scraper API. Access to consumers' online activity is a massive advantage for anyone whose job is to predict where consumers' money will move. We are only beginning to realize how powerful a tool web scraping can be and what value it can generate.