Client Background

  • Client: A leading legal firm in the USA
  • Industry Type: Legal
  • Products & Services: Legal services
  • Organization Size: 120+

The Problem

Managing and optimizing Local Services Ads (LSAs) for multiple law firms requires real-time performance tracking, precise cost management, and seamless integration with lead management systems like Lead Docket. The absence of automated syncing between ad performance data and lead tracking often leads to inefficiencies, inaccurate reporting, delayed insights, and suboptimal ad spend.

Our Solution

We developed an end-to-end automation tool that fetches performance data from Google Local Services Ads and synchronizes it with Lead Docket data. This tool processes, cleans, and stores the data in Google BigQuery and powers a Looker Studio dashboard to provide real-time, interactive performance reports. Additionally, it supports automated bid adjustments, ad content updates, and ad pausing logic based on service availability, ensuring optimal ad spend and inventory alignment.
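The overall daily flow can be sketched as follows. This is a minimal illustration, not the client's actual code: the stub functions stand in for the real LSA and Lead Docket API calls, and the field names (`account`, `leads`, `cost`, `converted`) are assumptions for demonstration only.

```python
from datetime import date

# Hypothetical stubs standing in for the real extraction modules;
# the production script calls the LSA and Lead Docket APIs here.
def fetch_lsa_report(report_date):
    # placeholder for the Local Services Ads API pull
    return [{"account": "firm-a", "leads": 12, "cost": 540.0}]

def fetch_lead_docket(report_date):
    # placeholder for the Lead Docket API pull
    return [{"account": "firm-a", "converted": 4}]

def merge_by_account(lsa_rows, ld_rows):
    # join ad-performance rows with lead-tracking rows per account
    ld_index = {r["account"]: r for r in ld_rows}
    return [{**row, **ld_index.get(row["account"], {})} for row in lsa_rows]

def run_daily_sync(report_date=None):
    report_date = report_date or date.today()
    merged = merge_by_account(fetch_lsa_report(report_date),
                              fetch_lead_docket(report_date))
    return merged  # production: loaded into BigQuery for the dashboard
```

The key design point is the per-account join: once ad spend and lead outcomes share one row, downstream KPIs and dashboards need no further cross-source lookups.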

Solution Architecture

  1. Authentication Layer
    1. Google OAuth 2.0 for secure access to APIs
    2. Google Service Account for BigQuery operations
  2. Data Extraction Modules
    1. LSA_main() to retrieve reports from LSA API
    2. Getting_LSA_dataframe() to process account-specific reports
    3. Integration with Lead Docket API for lead data
  3. Preprocessing Layer
    1. Data_preprocessing() handles cleaning, column renaming, null handling, and computation of KPIs such as Missed Calls and Avg CPL
  4. Storage Layer
    1. Google BigQuery for storing processed data
    2. Dynamic table creation via BigQueryTableCreation()
    3. Data loading through TobigQuery()
  5. Reporting Layer
    1. Looker Studio dashboard visualizing real-time KPIs from both LSA and Lead Docket sources
  6. Automation Layer
    1. Backend script runs daily to update ad data and adjust campaigns
    2. Automatic bid adjustments, pausing of ads for unavailable services
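The preprocessing step (layer 3 above) can be sketched as below. The structure mirrors what Data_preprocessing() is described as doing; the input field names are illustrative assumptions, not the actual LSA API schema.

```python
def preprocess(rows):
    """Sketch of the cleaning/KPI step: fill nulls, then compute
    Missed Calls and Avg CPL per account. The real Data_preprocessing()
    also renames columns to match the BigQuery schema."""
    cleaned = []
    for row in rows:
        # null handling: treat missing metrics as zero
        leads = row.get("leads") or 0
        cost = row.get("cost") or 0.0
        answered = row.get("answered_calls") or 0
        total_calls = row.get("total_calls") or 0
        cleaned.append({
            "account": row["account"],
            "leads": leads,
            "cost": cost,
            # KPI: calls that came in but were never answered
            "missed_calls": max(total_calls - answered, 0),
            # KPI: average cost per lead, guarded against division by zero
            "avg_cpl": round(cost / leads, 2) if leads else 0.0,
        })
    return cleaned
```

Guarding the Avg CPL division matters because accounts with spend but zero credited leads do occur in practice.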

Deliverables

  • Fully automated backend data pipeline for LSA and Lead Docket data
  • BigQuery database with structured, cleaned tables
  • Looker Studio dashboard with combined metrics
  • Secure access control for internal use only
  • Daily synchronization script for ad performance and lead data
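The daily script's ad-pausing behavior can be sketched as a pure decision step. This is an illustrative sketch only: campaign field names and the action format are assumptions, and the real script would push these actions through the LSA API.

```python
def plan_ad_actions(campaigns, available_services):
    """Decide which ads to pause or resume based on current
    service availability. Field names are illustrative."""
    actions = []
    for c in campaigns:
        if c["service"] not in available_services:
            # service no longer offered: stop spending on its ads
            actions.append({"campaign_id": c["id"], "action": "pause"})
        elif c.get("paused"):
            # service is available again: bring its ads back
            actions.append({"campaign_id": c["id"], "action": "resume"})
    return actions
```

Separating the decision from the API call keeps the logic testable without touching live campaigns.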

Tech Stack

  • Tools used
    • Google Cloud Platform (BigQuery, OAuth 2.0, IAM)
    • Looker Studio (for reporting)
    • Lead Docket API
    • Google Local Services Ads API
  • Language/techniques used
    • Python (requests, pandas, numpy, json, datetime, pytz)
    • SQL (BigQuery SQL for transformations)
    • RESTful APIs for data fetch and interaction
  • Skills used
    • Data engineering and pipeline development
    • API integration
    • Data visualization and dashboarding
    • Secure authentication and access control
    • Automation and scheduling

Technical Challenges Faced During Project Execution

  1. Discrepancies in API Data: Leads that should have been credited remained marked as pending in the API response.
  2. Timezone Inconsistencies: Mismatches between server and local time zones produced inconsistent datetime values.
  3. Schema Variability: Dynamic API response structures required flexible data handling.
  4. Access Control: Ensuring only authorized personnel had access to sensitive campaign data.

How the Technical Challenges Were Solved

  1. Pending Lead Discrepancy: Implemented logic to flag and cross-verify pending leads against business rules to estimate missing credits.
  2. Timezone Issues: Utilized pytz and server-local normalization to ensure consistent datetime formatting.
  3. Flexible Schema Handling: Used dynamic dataframe creation and validation checks to adapt to changing API structures.
  4. Secure Access: Integrated OAuth 2.0 with scoped permissions and service account keys to enforce access control policies.
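The timezone fix (item 2) amounts to localizing each naive API timestamp and converting it to UTC before storage. The project used pytz; the sketch below shows the same normalization with the standard-library zoneinfo module, and the source-timezone name is an assumption for illustration.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # stdlib equivalent of the pytz approach

def to_utc(ts_str, source_tz="America/New_York"):
    """Parse a naive timestamp string from an API response and
    normalize it to UTC. The default timezone is illustrative; the
    real script would use each account's configured time zone."""
    naive = datetime.strptime(ts_str, "%Y-%m-%d %H:%M:%S")
    localized = naive.replace(tzinfo=ZoneInfo(source_tz))
    return localized.astimezone(timezone.utc)
```

Storing everything in UTC and converting only at display time keeps BigQuery joins between LSA and Lead Docket rows consistent.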

Project Snapshots

Image 1: Complete report

Image 2: Data from Lead Docket