GCP200DEBATCH

Build Batch Data Pipelines on Google Cloud

In this intermediate course, you will learn to design, build, and optimize robust batch data pipelines on Google Cloud. Moving beyond fundamental data handling, you will explore large-scale data transformations and efficient workflow orchestration, essential for timely business intelligence and critical reporting.

Get hands-on implementation practice with Dataflow (Google Cloud's managed service for Apache Beam pipelines) and Serverless for Apache Spark (formerly Dataproc Serverless), and tackle crucial considerations for data quality, monitoring, and alerting to ensure pipeline reliability and operational excellence.

Google Cloud
✓ Official Google Cloud training · Level: Intermediate · ⏱️ 1 day (7h)

What you will learn

  • Determine whether batch data pipelines are the correct choice for your business use case.
  • Design and build scalable batch data pipelines for high-volume ingestion and transformation.
  • Implement data quality controls within batch pipelines to ensure data integrity.
  • Orchestrate, manage, and monitor batch data pipeline workflows, implementing error handling and observability using logging and monitoring tools.

Prerequisites

  • Basic proficiency with Data Warehousing and ETL/ELT concepts
  • Basic proficiency in SQL
  • Basic programming knowledge (Python recommended)
  • Familiarity with gcloud CLI and the Google Cloud console
  • Familiarity with core Google Cloud concepts and services

Target audience

  • Data Engineers, Data Analysts

Training Program

4 modules to master the fundamentals

Objectives
  • Learn the critical role of a data engineer in developing and maintaining batch data pipelines
  • Understand core components and lifecycle of batch data pipelines
  • Analyze common challenges in batch data processing
  • Identify key Google Cloud services that address these challenges
Topics covered
  • Batch data pipelines and their use cases
  • Batch processing and common challenges
Activities

Quiz

Objectives
  • Design scalable batch data pipelines for high-volume data ingestion and transformation
  • Optimize batch jobs for high throughput and cost-efficiency using various resource management and performance tuning techniques
Topics covered
  • Design batch pipelines
  • Large-scale data transformations
  • Dataflow and Serverless for Apache Spark
  • Data connections and orchestration
  • Execute an Apache Spark pipeline
  • Optimize batch pipeline performance
Activities

Quiz

Lab: Build a Simple Batch Data Pipeline with Serverless for Apache Spark

Lab: Build a Simple Batch Data Pipeline with Dataflow Job Builder UI
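As a taste of what the labs above cover, here is a minimal, library-free Python sketch of the classic batch-pipeline stages: read, transform, aggregate, write. The actual labs use the Apache Beam SDK (via Dataflow) and Spark, whose APIs differ; record fields like `order_id` and the `parse_order` helper are purely illustrative.

```python
# Minimal batch-pipeline sketch: read -> transform -> aggregate, the stages
# the labs implement with Apache Beam / Spark. Pure stdlib, illustration only.
import csv
import io
from collections import defaultdict

RAW_CSV = """order_id,country,amount
1,FR,120.50
2,DE,80.00
3,FR,45.25
"""

def parse_order(row):
    # Transform: cast raw strings into typed records (hypothetical schema).
    return {"order_id": int(row["order_id"]),
            "country": row["country"],
            "amount": float(row["amount"])}

def run_pipeline(raw_csv):
    # Read: ingest raw rows from the source.
    records = [parse_order(r) for r in csv.DictReader(io.StringIO(raw_csv))]
    # Aggregate: total amount per country (a GroupByKey + Combine in Beam).
    totals = defaultdict(float)
    for rec in records:
        totals[rec["country"]] += rec["amount"]
    return dict(totals)

print(run_pipeline(RAW_CSV))
```

In a real Beam or Spark job each stage becomes a distributed transform, but the data flow is the same shape.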

Objectives
  • Develop data validation rules and cleansing logic to ensure data quality within batch pipelines
  • Implement strategies for managing schema evolution and performing data deduplication in large datasets
Topics covered
  • Batch data validation and cleansing
  • Log and analyze errors
  • Schema evolution for batch pipelines
  • Data integrity and duplication
  • Deduplication with Serverless for Apache Spark
  • Deduplication with Dataflow
Activities

Quiz

Lab: Validate Data Quality in a Batch Pipeline with Serverless for Apache Spark
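The deduplication topics above follow a common pattern: keep the most recent record per business key. A plain-Python sketch of that logic (field names are hypothetical; in Spark this is typically done with `dropDuplicates` or a window function, and in Beam with a GroupByKey plus a latest-timestamp combine):

```python
# Keep-latest deduplication sketch: one record per business key, choosing
# the newest event_time. Pure stdlib, for illustration only.
def deduplicate(records, key="order_id", ts="event_time"):
    latest = {}
    for rec in records:
        k = rec[key]
        # Replace the stored record if this one is newer.
        if k not in latest or rec[ts] > latest[k][ts]:
            latest[k] = rec
    return sorted(latest.values(), key=lambda r: r[key])

events = [
    {"order_id": 1, "event_time": "2024-01-01T10:00", "status": "created"},
    {"order_id": 1, "event_time": "2024-01-01T11:00", "status": "paid"},
    {"order_id": 2, "event_time": "2024-01-01T09:30", "status": "created"},
]
print(deduplicate(events))
```

The same keep-latest rule scales out in Spark or Dataflow; only the execution engine changes.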

Objectives
  • Orchestrate complex batch data pipeline workflows for efficient scheduling and lineage tracking
  • Implement robust error handling, monitoring, and observability for batch data pipelines
Topics covered
  • Orchestration for batch processing
  • Cloud Composer
  • Unified observability
  • Alerts and troubleshooting
  • Visual pipeline management
Activities

Quiz

Lab: Building Batch Pipelines in Cloud Data Fusion
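The orchestration module centers on Cloud Composer, Google Cloud's managed Apache Airflow service. Its core behavior, running tasks in dependency order with retries, can be sketched in plain Python (task names are hypothetical; real DAGs are defined with Airflow operators, not callables like these):

```python
# Toy orchestrator: run tasks in dependency order with simple retries --
# the core behavior Cloud Composer (Apache Airflow) provides at scale.
from graphlib import TopologicalSorter

def run_dag(tasks, deps, retries=2):
    """tasks: {name: callable}; deps: {name: set of upstream task names}."""
    results = {}
    # static_order() yields each task only after all its upstreams.
    for name in TopologicalSorter(deps).static_order():
        for attempt in range(retries + 1):
            try:
                results[name] = tasks[name]()
                break
            except Exception:
                if attempt == retries:
                    raise  # exhausted retries: fail the run
    return results

order = []
tasks = {
    "extract":   lambda: order.append("extract"),
    "transform": lambda: order.append("transform"),
    "load":      lambda: order.append("load"),
}
deps = {"extract": set(), "transform": {"extract"}, "load": {"transform"}}
run_dag(tasks, deps)
print(order)
```

Airflow adds scheduling, backfills, and per-task observability on top of this dependency-ordered execution.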

Related Trainings

AWS

Advanced Architecting on AWS

In this course, each module presents a scenario with an architectural challenge to be solved. You will examine available AWS services and features as solutions to the problem. You will gain insights by participating in problem-based discussions and learning about the AWS services that you could apply to meet the challenges. Over 3 days, the course goes beyond the basics of a cloud infrastructure and covers topics to meet a variety of needs for AWS customers. Course modules focus on managing multiple AWS accounts, hybrid connectivity and devices, networking with a focus on AWS Transit Gateway connectivity, container services, automation tools for continuous integration/continuous delivery (CI/CD), security and distributed denial of service (DDoS) protection, data lakes and data stores, edge services, migration options, and managing costs. The course concludes by presenting you with scenarios and challenging you to identify the best solutions.

⏱️ 3 days · Level: Advanced · AWS

Architecting on AWS

Architecting on AWS is for solutions architects, solution-design engineers, and developers seeking an understanding of AWS architecting. In this course, you will learn to identify services and features to build resilient, secure, and highly available IT solutions on the AWS Cloud. Architectural solutions differ depending on industry, types of applications, and business size. AWS Authorized Instructors emphasize best practices using the AWS Well-Architected Framework, and guide you through the process of designing optimal IT solutions based on real-life scenarios. The modules focus on account security, networking, compute, storage, databases, monitoring, automation, containers, serverless architecture, edge services, and backup and recovery. At the end of the course, you will practice building a solution and apply what you have learned.

⏱️ 3 days · Level: Intermediate · AWS

Upcoming sessions

No date suits you?

We regularly organize new sessions. Contact us to find out about upcoming dates or to schedule a session at a date of your choice.

Register for a custom date

Quality Process

SFEIR Institute's commitment: an excellence approach to ensure the quality and success of all our training programs.

Teaching Methods Used
  • Lectures / Theoretical Slides — Presentation of concepts using visual aids (PowerPoint, PDF).
  • Technical Demonstration (Demos) — The instructor performs a task or procedure while students observe.
  • Guided Labs — Guided practical exercises on software, hardware, or technical environments.
  • Quiz / MCQ — Quick knowledge check (paper-based or digital via tools like Kahoot/Klaxoon).
Evaluation and Monitoring System

The achievement of training objectives is evaluated at multiple levels to ensure quality:

  • Continuous Knowledge Assessment: verification of knowledge throughout the training via participatory methods (quizzes, practical exercises, case studies) under instructor supervision.
  • Progress Measurement: comparative self-assessment system including an initial diagnostic to determine the starting level, followed by a final evaluation to validate skills development.
  • Quality Evaluation: end-of-session satisfaction questionnaire to measure the relevance and effectiveness of the training as perceived by participants.

790€ excl. VAT

per learner