Every now and then I’m fortunate enough to be able to publish guest posts by illustrious people in the analytics and digital marketing industries. This time, I get to work with an old colleague of mine who’s a veritable wizard when it comes to building solutions that make our daily work in the digital industry so much smoother.

Erik Grönroos works as an analyst in MarkkinointiAkatemia, a Finnish digital customer acquisition agency. Erik’s job is to utilize data and analytics to help grow the agency as well as the companies of the clients he works with.

In this guide, Erik will share with you how to build an application which helps you automate the monitoring of your Google Ads accounts using the R programming language, the AdWords API and the Slack API.

I’m particularly excited to publish this article, as it’s the first time (if I recall correctly) that Google Ads and R are covered in the blog.

Anyway, as the editor, all the mistakes and errors you might find in the article are 100% mea culpa, so let me know in the comments if any errors persisted to the final draft. Thank you!




In this article I’ll show you how to implement an automated monitoring application for keeping an eye on your Google Ads accounts. Since the nature of the implementation is a Proof of Concept (PoC), I’ll try to keep it simple by using just one practical use case for demonstrating the end-to-end pipeline:

Zero-cost detection in Ads accounts, for monitoring credit card expirations or other issues that prevent your ads from running.

This implementation utilizes R and the Google AdWords API for checking the data, the Slack API for real-time alerting via your Slack bot if something critical is detected, and good ol’ cron for scheduling the script.

If you’re a marketing agency, a typical situation is that you have plenty of Google Ads accounts to take care of. When the number of accounts reaches the hundreds or thousands, manually managing, monitoring, and maintaining them isn’t really an option anymore.

What you can do, of course, is set up Google Ads’ built-in alerts one by one for each of your accounts. You could even manage these alerts via the Google Ads API or utilize Google Ads scripts. But the problem with these is that they involve email alerts by default. Hey, it’s 2019. Email is probably not the only channel you should use for building automation on.

So, we decided to harness the force of APIs and, as a result, saved the galaxy!

How the solution works

The script is populated with the list of Google Ads account IDs we want to monitor. The monitoring system itself queries the cost data of each ID using the Google AdWords API, and parses the results for those accounts that have accumulated no cost at all, signalling potential issues with your budgets or your configurations.

When this information has been parsed, the account IDs that were flagged are written to a database for further analysis, and a notification is dispatched to the Slack bot so that the team responsible can be immediately notified when problems arise.


About R

R is a statistical programming language that today supports pretty much the same Machine Learning (ML) tricks that Python does. R vs. Python is a long-standing debate, so it really boils down to language preference: which one do you want to use for cool data stuff? Okay, I’ll put my pants back on, hold on.

We don’t need ML in this simple monitoring example, but we do use it for more advanced solutions. That’s why R is the tool of choice for us.

RStudio is an IDE for R that makes R easier to use. A quote from RStudio’s website:

RStudio is an integrated development environment (IDE) for R. It includes a console, syntax-highlighting editor that supports direct code execution, as well as tools for plotting, history, debugging and workspace management.

RStudio is available in open source and commercial editions and runs on the desktop (Windows, Mac, and Linux) or in a browser connected to RStudio Server or RStudio Server Pro (Debian/Ubuntu, RedHat/CentOS, and SUSE Linux).

If you don’t have RStudio installed yet, grab yours from RStudio downloads.

We won’t get into the RStudio installation details in this article. RStudio’s website provides comprehensive documentation and step-by-step how-to’s for that.

Once you have your RStudio setup up and running, let’s script!

Episode I: An R script

We are using MariaDB to store the Google Ads Customer IDs of our clients, from where they can be retrieved for several other integration and automation purposes we need.

If you’re using some other database server, you need to modify the DB part of the code to suit your needs. R has comprehensive support for a wide variety of databases, perhaps even all of them. Basically, if it’s something to do with data, R can read from it.

If you’re not using a database, you can skip the DB part. Optionally, R can also read the IDs from e.g. Google Sheets or a CSV file. How to do this in R is your homework!
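If you’d like a head start on that homework, here’s a minimal sketch of the CSV route. The file name and column name below are assumptions - match them to your own file:

```r
# Demo only: create a small CSV in the expected layout, with one ads_customer_id column
writeLines(c("ads_customer_id", "123-123-1234", "234-234-2345"), "ads_accounts.csv")

# Read the IDs from the CSV instead of the DB
# colClasses = "character" keeps the IDs as strings, so the dashes survive intact
ads_accounts <- read.csv("ads_accounts.csv", colClasses = "character")
account_list <- ads_accounts$ads_customer_id
```

For Google Sheets, packages such as googlesheets4 offer a similar one-call read.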

You’re probably thinking: “But why aren’t you reading the IDs directly from the AdWords API?”

Well, the unfortunate thing is that RAdwords, the R AdWords package, implements only the AdWords API Reporting Service, and the Reporting Service does not support querying Google Ads Customer IDs on the Manager Account level (formerly known as My Client Center or Google MCC). Instead, you’d need access to the AdWords API Managed Customer Service to do that.

In our case, this isn’t a problem though, because we want to persist the IDs in our own custom environment and have different classifications and levels for different sorts of contracts and clients. This way we are able to perform the desired actions on those clients that need them, and we can also add monitoring capabilities to the setup.

If you need to read the IDs directly from the AdWords API, you can access the Managed Customer Service via a newer R package called adwordsR.

Now, let’s script for real!

Create a new R file in your RStudio with the following content, set up your DB user and pass, and save the file as db_credentials.R in /YOUR_R_USER_ROOT_DIR/auth/. This is your DB auth file.

db_username = "YOUR_USERNAME"
db_password = "YOUR_PASSWORD"

Create another file including the following code and save the file as connect_db.R in /YOUR_R_USER_ROOT_DIR/. This is your DB connector.

# Include DB credentials from your /auth/ dir
source("~/auth/db_credentials.R")

# Check if the required packages are installed, install if not
if(!require(DBI)) install.packages("DBI")
if(!require(RMariaDB)) install.packages("RMariaDB")

# Load packages
library(DBI)
library(RMariaDB)

# Create a DB connection handle
con <- dbConnect(
  drv = RMariaDB::MariaDB(),
  username = db_username,
  password = db_password,
  host = "YOUR_DB_HOST",
  port = 3306,
  dbname = "YOUR_DB_NAME" # The database that holds your ads_accounts table
)

And finally, the actual Ads R script: create a file with the following code and save it as google-ads-zero-cost-monitor.R in /YOUR_R_USER_ROOT_DIR/.

# Init the DB connection
source("~/connect_db.R")

# Check if the required packages are installed, install if not
if(!require(RAdwords)) install.packages("RAdwords")

# Load packages
library(RAdwords)

# Define the file path where your AdWords API OAuth RData file should be stored; we'll need this next
google_ads_auth_file <- "~/auth/google_ads_auth.RData"

# Check if the google_auth RData file already exists, create if not
if(!file.exists(google_ads_auth_file)) {
  # Auth steps:
  # 1) You'll need a Google API Client ID, Client Secret and AdWords Developer Token here for authenticating, see https://github.com/jburkhardt/RAdwords for details
  # 2) Once you've provided those in your R console when R requests them, you'll be redirected to an OAuth screen in your browser
  # 3) Authenticate with your Google Account (the same one that has access to your AdWords account)
  # 4) You'll be redirected to an OAuth success screen that provides a final token code and asks you to copy/paste it into your application (your R console in this case)
  google_auth <- doAuth(save = FALSE)
  # Save the auth credentials so the auth can be loaded from a file from now on
  save(google_auth, file = google_ads_auth_file)
}

# Load Google auth credentials
load(file = google_ads_auth_file)

# Init date vars
dayBeforeYesterday <- Sys.Date() - 2
yesterday <- Sys.Date() - 1

# Fetch Google Ads Customer IDs from your DB
ads_accounts <- dbGetQuery(con, "SELECT ads_customer_id FROM ads_accounts")

# Create a list of the IDs for later looping
account_list <- ads_accounts$ads_customer_id

# Alternatively, if you want to skip the DB stuff, you can define your Google Ads Customer IDs manually by uncommenting the following line and listing the IDs here:
# account_list <- c("123-123-1234", "234-234-2345")

# Init AdWords API request
# We're checking the cost data from two previous days
body <- statement(select = c('Cost'),
                  report = "ACCOUNT_PERFORMANCE_REPORT",
                  start  = dayBeforeYesterday,
                  end    = yesterday)

# Init a data frame for collecting the data and an index variable for keeping track of progress during the for loop
adsdata <- data.frame()
progressIndex <- 0

# Get the Ads cost data
for (account in account_list) {

  # Get cost data
  ads_api_response <- getData(clientCustomerId = account,
                              google_auth      = google_auth,
                              statement        = body)

  # Proceed only if the API doesn't return an error (which happens when there's no data to return or when the API fails); this drops false zero cost alerts
  if(!is.null(dim(ads_api_response))) {
    # Init empty data frame
    adsdata_temp <- setNames(data.frame(matrix(ncol = 2, nrow = 1)), c("Cost", "Account"))
    # Include account id
    adsdata_temp$Account <- account
    # Include cost data
    # If there's no data, the result is doubled for some reason
    # Fix this by always reading the 1st value only with [1]
    adsdata_temp$Cost <- ads_api_response$Cost[1]
    # If cost is zero, NA is returned, replace it with zero
    if(is.na(adsdata_temp$Cost)) adsdata_temp$Cost <- 0
    # Collect during each iteration
    adsdata <- rbind(adsdata, adsdata_temp, stringsAsFactors = FALSE)
  }

  # Print progress to R console
  progressIndex <- progressIndex + 1
  cat(paste("Query ", progressIndex, " of ", NROW(account_list), " done\r\n", sep = ""))
}


# Keep accounts with zero cost only
ads_accounts_with_zero_cost <- subset(adsdata, Cost == 0)

# Count nbr of alerted accounts
nbr_of_accounts_with_alerts <- nrow(ads_accounts_with_zero_cost)

# Save all alerts to DB, if there were any
# This assumes you have a DB table named "alerts" with the fields:
# ads_customer_id VARCHAR
# category VARCHAR
# status_id INT
# created_at DATETIME
# updated_at DATETIME

if(nbr_of_accounts_with_alerts > 0) {
  for (i in 1:nrow(ads_accounts_with_zero_cost)) {
    row <- ads_accounts_with_zero_cost[i, ]
    ads_customer_id <- row$Account # Note: the account id column is named "Account" in the adsdata data frame
    category <- "ALERT_ZERO_COST" # For categorizing different types of alerts
    status_id <- 0 # To be set to e.g. 1 when the alert is handled via a custom UI
    # Save each item to db
    query <- paste("INSERT INTO alerts (ads_customer_id, category, status_id, created_at, updated_at) ",
                   "VALUES (",
                   "'", ads_customer_id, "', ",
                   "'", category, "', ",
                   status_id, ", ",
                   "now(), ",
                   "now())",
                   sep = "")
    dbExecute(con, query)
  }
}
# If you want to check/debug what was stored, uncomment and run these lines
# db_alerts <- dbGetQuery(con, "SELECT * FROM alerts")
# View(db_alerts)
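A side note on the INSERT above: pasting values into a SQL string with paste() works, but it’s brittle - a quote character in any value breaks the statement, and it opens the door to SQL injection. DBI also supports parameterized statements via the params argument of dbExecute(). Here’s a self-contained sketch using an in-memory SQLite database as a stand-in; with MariaDB you’d reuse the existing con handle and keep now() instead of SQLite’s datetime('now'):

```r
library(DBI)
library(RSQLite)

# Stand-in database; in the monitor you'd use your MariaDB handle "con" instead
con <- dbConnect(RSQLite::SQLite(), ":memory:")
dbExecute(con, "CREATE TABLE alerts (ads_customer_id TEXT, category TEXT, status_id INT, created_at TEXT, updated_at TEXT)")

# Parameterized INSERT: values are bound to "?" placeholders, never pasted into the string
dbExecute(con,
          "INSERT INTO alerts (ads_customer_id, category, status_id, created_at, updated_at)
           VALUES (?, ?, ?, datetime('now'), datetime('now'))",
          params = list("123-123-1234", "ALERT_ZERO_COST", 0))

db_alerts <- dbGetQuery(con, "SELECT * FROM alerts")
```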

Now you have your Ads zero cost monitor script ready and you can move on to setting up your Slack bot.

Episode II: The Slack bot strikes back

Because the guys at Slack are awesome, they have written an excellent guide on how to set up your bot. They even serve you virtual cookies! Slack’s guides are a textbook example of how documentation should be written - they’re always a pleasure to read and follow! Thanks guys!

So, follow the Slack bot users guide. The required steps for our alert notifications are:

  1. Create your Slack app

  2. Create your bot user

  3. Install your bot to your Slack workspace

Once you manage to get your bot installed into your Slack workspace, you will get a webhook URL that you’ll need in the following script.
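Before wiring the webhook into the monitor, it can be worth sanity-checking the JSON payload you’re about to POST. One alternative to hand-concatenating the JSON is to build the attachment as a regular R list and let jsonlite serialize it (the sample numbers below are made-up placeholders):

```r
library(jsonlite)

# The same attachment structure the monitor script builds, as an R list
# (the counts here are hypothetical sample values)
payload <- list(attachments = list(list(
  title   = "ALERT - Number of zero cost accounts detected: 2",
  pretext = "Number of Google Ads accounts checked: 150",
  text    = "Alerts saved to database! <!channel>",
  color   = "#ee0000"
)))

# auto_unbox = TRUE turns length-1 vectors into JSON scalars instead of arrays
json_body <- toJSON(payload, auto_unbox = TRUE)
cat(json_body)
```

Letting jsonlite build the body also escapes any quotes inside the message for you, which manual paste() concatenation does not.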

The bot script part is here:


# Replace the default value with the webhook URL you got from Slack
webhook_url <- "https://hooks.slack.com/services/12345"

# Check if the required packages are installed, install if not
if(!require(jsonlite)) install.packages("jsonlite") 
if(!require(httr)) install.packages("httr")

# Load packages
library(httr)
library(jsonlite)

# Nbr of accounts for the Slack notification
nbr_of_accounts <- NROW(account_list)

# Setup the message you want to send to Slack
msg_desc <- paste("Number of Google Ads accounts checked: ",
                  nbr_of_accounts,
                  sep = "")

msg_title <- paste("ALERT - Number of zero cost accounts detected: ",
                   nbr_of_accounts_with_alerts,
                   sep = "")

# This places a channel ping if there were occurrences with zero cost, and if not, defaults to empty message
if(nbr_of_accounts_with_alerts > 0) {
  msg_body <- "Alerts saved to database! <!channel>"
} else {
  msg_body <- ""
}

# Alert to Slack - POST the msg
request <- POST(webhook_url,
                body = paste(
                  '{"attachments": [{',
                  '"title": "', msg_title, '", ',
                  '"pretext": "', msg_desc, '", ',
                  '"text": "', msg_body,
                  '", "color": "#ee0000"',
                  '}]}',
                  sep = ''
                ))

Once you’ve set this up, include the above Slack part into your google-ads-zero-cost-monitor.R script. Your monitoring script is now done!

This is what a sample notification would look like in Slack:

Test it carefully in your RStudio and make sure everything is working correctly. After that, you’re ready to move your script to production by scheduling it via cron.

Episode III: Return of the cron job

Open crontab in your Linux shell for editing:

crontab -e

Add a cron job that suits your needs. Here’s an example entry that runs the R monitoring script daily at 10 AM.

# Daily at 10 AM
0 10 * * * Rscript /home/YOUR_USERNAME/google-ads-zero-cost-monitor.R
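Since cron runs silently in the background, it’s worth capturing the script’s console output somewhere you can inspect it when an alert looks off. A variant of the entry above that appends both stdout and stderr to a log file (the log path is just an example):

```shell
# Daily at 10 AM, with console output appended to a log file
0 10 * * * Rscript /home/YOUR_USERNAME/google-ads-zero-cost-monitor.R >> /home/YOUR_USERNAME/ads-monitor.log 2>&1
```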


You have now created a potentially powerful Google Ads monitoring pipeline with endless opportunities for extension. Other useful applications of the pipeline include, for example, anomaly detection, high spend tracking, campaign-level checks, automated ad placement optimization, and so forth.

It goes without saying that automation is absolutely necessary in this day and age when it can save you and your team from countless hours of manual labor. By automating trivial processes such as monitoring and anomaly detection, you are freeing up your most valuable resources (people) to work on more ambitious tasks.

As always, we’re looking forward to your comments and questions. Let us know what you think of the solution, if it could be improved, and if you already have extensions in mind!