by François Joly | Sep 23, 2022 | R'
Step 1: getting the URLs. This one is going to be quick; we will use the xsitemap package, which crawls XML sitemaps.

    library(xsitemap)
    library(urltools)
    library(XML)
    library(httr)
    upload <- xsitemapGet("https://www.rforseo.com/sitemap.xml")

...
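If you would rather not depend on the xsitemap package, the same list of URLs can be pulled with xml2 alone. A minimal sketch, assuming the file is a standard <urlset> sitemap (a sitemap index would need one extra pass to fetch each child sitemap):

    library(xml2)
    sitemap <- read_xml("https://www.rforseo.com/sitemap.xml")
    xml_ns_strip(sitemap)   # drop the sitemap namespace so the XPath stays simple
    urls <- xml_text(xml_find_all(sitemap, "//url/loc"))
    head(urls)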
by François Joly | Feb 8, 2021 | R'
Dear SEOs, I've made a few apps that you might find useful.

CTR by Average Position
The first app computes Google Search queries CTR by average position. 👉 https://gokam.shinyapps.io/ctr_pos/ With a big website it looks like this: green is the average per position, red...
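If you want to reproduce that kind of aggregate outside the app, the computation is a simple group-and-summarise. A minimal dplyr sketch; the file name and the column names (clicks, impressions, position) are assumptions based on a typical Search Console performance export, not the app's actual code:

    library(dplyr)
    gsc <- read.csv("search_console_export.csv")   # hypothetical Search Console export
    ctr_by_pos <- gsc %>%
      mutate(position = round(position)) %>%       # bucket queries by rounded average position
      group_by(position) %>%
      summarise(clicks      = sum(clicks),
                impressions = sum(impressions),
                ctr         = clicks / impressions)
    head(ctr_by_pos)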
by François Joly | Mar 25, 2020 | R'
Article has been migrated here. It will be a long article, so I added a table of contents 👇 Fancy, right?

Table Of Contents
Crawl an entire website with Rcrawler
The INDEX variable
HTML Files
So how to extract metadata while crawling?
Explore Crawled Data with...
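As a taste of what the full article covers, here is a minimal Rcrawler sketch. The site URL and the parallelism settings are example values; by default Rcrawler saves the crawled HTML files to disk and fills an INDEX data frame in your global environment:

    library(Rcrawler)
    # crawl the whole site (example values for cores and connections)
    Rcrawler(Website = "https://www.rforseo.com/", no_cores = 2, no_conn = 2)
    # the crawl summary lands in the INDEX data frame
    head(INDEX)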
by François Joly | Mar 6, 2020 | R'
If you want to crawl a couple of URLs for SEO purposes, there are many, many ways to do it, but one of the most reliable and versatile packages you can use is rvest. Here is a simple demo from the package documentation using the IMDb website:

    # Package installation,...
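To make it concrete for SEO work, here is a minimal rvest sketch that pulls the title tag and meta description of a page. The URL is just an example, and the selectors assume a standard HTML head:

    library(rvest)
    page <- read_html("https://www.rforseo.com/")
    page_title <- page %>% html_node("title") %>% html_text()
    meta_desc  <- page %>% html_node("meta[name='description']") %>% html_attr("content")
    page_title
    meta_desc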
by François Joly | Mar 2, 2020 | R'
R and RStudio are great, but sometimes it's better to just export your data to work with it elsewhere or show it to other people. Here is a review of possible techniques:

Export your data into a CSV
Assuming your data is stored inside a df variable, fairly...
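The CSV export usually comes down to a single base R call. A minimal sketch; the data frame and file name are placeholders, and the original post may well use a different helper such as readr::write_csv:

    # a throwaway data frame standing in for your own df
    df <- data.frame(url    = c("https://example.com/a", "https://example.com/b"),
                     clicks = c(120, 45))
    # row.names = FALSE keeps the row index out of the file
    write.csv(df, "export.csv", row.names = FALSE)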
by François Joly | Mar 1, 2020 | R'
Selenium is a very classic tool for QA, and it can help perform automatic checks on a website. This is an intro to how to use it. The first step is, as always, to install and load the RSelenium package:

    # install, to run once
    install.packages("RSelenium")...
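Once the package is installed, a typical session starts a browser, loads a page, and reads something back. A minimal sketch, assuming a local Firefox install and that rsDriver can download a matching driver on its own (the port value is arbitrary):

    library(RSelenium)
    driver <- rsDriver(browser = "firefox", port = 4545L, verbose = FALSE)
    remote <- driver$client
    remote$navigate("https://www.rforseo.com/")   # load the page to check
    remote$getTitle()                             # read the page title back
    remote$close()                                # close the browser
    driver$server$stop()                          # stop the Selenium server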