
· 6 min read
ScrapingAnt Team

Web Scraping for Data Scientists

Data is all around us, and scientists train themselves to question everything. They usually spend hours studying data in their field to drive learning, understanding, and innovation.

However, to procure the volume of data necessary, scientists often need help from computer programs and AI technology. Many times, the correct technology for this job is a web scraping tool.

This article will explain how data scientists use web scraping, cover the basics of how it works, and show why ScrapingAnt can help you get the information you need.

· 9 min read
Oleg Kulyk

Web Scraping with Deno

Dynamic languages are helpful tools for web scraping. Scripting allows users to rapidly tie together complex systems or libraries and express ideas without dealing with memory management or build systems.

JavaScript is the most popular dynamic language, running on every device with a web browser, and Node.js as a JS runtime has proved to be a very successful software platform. However, early design mistakes made Node.js hard to evolve without breaking its existing user base, so Deno was born to resolve those problems. Let's find out how to scrape the web and dynamic websites with Deno.
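As a quick taste of what the article covers, here is a minimal sketch of fetching and parsing a static page in Deno. It assumes the third-party deno_dom module for HTML parsing, and the target URL is just a placeholder.

```javascript
// Deno ships fetch out of the box; deno_dom (third-party) handles HTML parsing.
import { DOMParser } from "https://deno.land/x/deno_dom/deno-dom-wasm.ts";

const response = await fetch("https://example.com"); // placeholder URL
const html = await response.text();

const doc = new DOMParser().parseFromString(html, "text/html");
console.log(doc?.querySelector("title")?.textContent);
```

Run it with `deno run --allow-net scrape.js`, since Deno's permission model requires explicitly allowing network access.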

· 10 min read
Oleg Kulyk

Scrape a Dynamic Website with Python

The Internet evolves quickly, and modern websites often use dynamic content loading to provide the best user experience. On the other hand, this makes data extraction harder, as it requires executing the page's internal JavaScript in the page context while scraping. Let's review several conventional techniques for extracting data from dynamic websites using Python.

· 12 min read
Oleg Kulyk

Web Scraping with Javascript

JavaScript (JS) is becoming more popular as a programming language for web scraping. The whole domain is in growing demand, and more technical specialists are starting data mining with this handy scripting language. Let's check out the main concepts of web scraping with JavaScript and review the most popular libraries that improve the data extraction flow.
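To give a flavor of the workflow before reviewing the libraries, here is a minimal sketch of scraping a static page in Node.js with axios and cheerio, two commonly used packages; the target URL and selector are placeholders rather than examples from the article.

```javascript
const axios = require('axios');
const cheerio = require('cheerio');

(async () => {
  // Download the raw HTML (placeholder URL)
  const { data: html } = await axios.get('https://example.com');

  // Load it into cheerio and extract all link targets
  const $ = cheerio.load(html);
  const links = $('a').map((_, el) => $(el).attr('href')).get();
  console.log(links);
})();
```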

· 7 min read
Oleg Kulyk

6 Puppeteer Tricks to Avoid Detection and Make Web Scraping Easier

As you know, Puppeteer is a high-level API to control headless Chrome, and it's probably one of the most popular web scraping tools on the Internet. The only problem is that an average web developer might be overwhelmed by the sheer number of possible settings for a proper web scraping setup.

I want to share 6 handy and fairly obvious tricks that should help web developers increase their web scraper's success rate, improve performance, and avoid bans.
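As a preview of the kind of tweaks covered, here is a minimal sketch of two of them: overriding the default headless user agent and hiding the navigator.webdriver flag that many detection scripts check. The user agent string and target URL are placeholders.

```javascript
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch({ headless: true });
  const page = await browser.newPage();

  // Replace the default "HeadlessChrome" user agent with a realistic one (placeholder string)
  await page.setUserAgent('Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36');

  // Hide the navigator.webdriver flag before any page script runs
  await page.evaluateOnNewDocument(() => {
    Object.defineProperty(navigator, 'webdriver', { get: () => undefined });
  });

  await page.goto('https://example.com'); // placeholder URL
  console.log(await page.title());
  await browser.close();
})();
```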

· 4 min read
Oleg Kulyk

How to use a proxy in Playwright?

Playwright is a high-level API to control and automate headless Chrome (Chromium), Firefox, and WebKit. It can be considered an extended Puppeteer, as it supports more browser types for automating the testing and scraping of modern web apps. The Playwright API is available in JavaScript & TypeScript, Python, C#, and Java. In this article, we are going to show how to set up a proxy in Playwright for all the supported browsers.
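For a quick preview, here is a minimal sketch of launching Chromium through a proxy with Playwright's proxy launch option; the proxy address and credentials are placeholders. The same option works with the firefox and webkit launchers.

```javascript
const { chromium } = require('playwright');

(async () => {
  // Proxy server and credentials below are placeholders
  const browser = await chromium.launch({
    proxy: {
      server: 'http://proxy.example.com:8080',
      username: 'proxy-user',
      password: 'proxy-password',
    },
  });

  const page = await browser.newPage();
  await page.goto('https://httpbin.org/ip'); // shows which IP the target site sees
  console.log(await page.textContent('body'));
  await browser.close();
})();
```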

· 4 min read
Oleg Kulyk

How to use rotating proxies with Puppeteer?

Puppeteer is a high-level API to control headless Chrome. Most things you can do manually in the browser can also be done with Puppeteer, so it quickly became one of the most popular web scraping tools in Node.js and Python. Many developers use it for single-page application (SPA) data extraction, as it allows executing client-side JavaScript. In this article, we are going to show how to set up a proxy in Puppeteer and how to spin up your own rotating proxy server.
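As a quick preview, here is a minimal sketch of pointing Puppeteer at a proxy via a Chrome launch flag and authenticating against it. The proxy endpoint and credentials are placeholders; a rotating proxy service (or your own rotating proxy server) would sit behind that address and hand out a new exit IP per session.

```javascript
const puppeteer = require('puppeteer');

(async () => {
  // Placeholder proxy endpoint; swap in your rotating proxy address
  const browser = await puppeteer.launch({
    args: ['--proxy-server=http://rotating-proxy.example.com:8000'],
  });

  const page = await browser.newPage();
  // Authenticate if the proxy requires credentials (placeholders)
  await page.authenticate({ username: 'proxy-user', password: 'proxy-password' });

  await page.goto('https://httpbin.org/ip'); // shows which exit IP the target sees
  console.log(await page.evaluate(() => document.body.innerText));
  await browser.close();
})();
```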

· 4 min read
Oleg Kulyk

How to use Microsoft Edge with Playwright

Scraping a website with a real, widely used browser helps ensure that the scraper won't be banned based on its fingerprint or behavioral pattern. Playwright already provides full support for Chromium, Firefox, and WebKit out of the box, without installing the browsers manually, but since most users run Google Chrome or Microsoft Edge rather than the open-source Chromium build, in some scenarios it's safer to use them to emulate a more realistic browser environment.
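For a quick preview, here is a minimal sketch of driving a locally installed Microsoft Edge via Playwright's channel option; the target URL is a placeholder, and Edge must already be installed on the machine.

```javascript
const { chromium } = require('playwright');

(async () => {
  // 'msedge' tells Playwright to launch the locally installed Microsoft Edge
  const browser = await chromium.launch({ channel: 'msedge' });
  const page = await browser.newPage();
  await page.goto('https://example.com'); // placeholder URL
  console.log(await page.title());
  await browser.close();
})();
```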