GCP Data Engineer — Remote (Contract, W2, OPT welcome)

By USA JOB Finder


GCP Data Engineer (Remote)

Department: Data Engineering / Cloud Engineering
Experience: 3–5 years
Employment Type: Contract (W2)
Location: Remote
Visa: OPT accepted
Apply / Share resume: udaykumar@arksintelitech.com


Job Overview

We’re hiring a GCP Data Engineer with 3–5 years of hands-on experience building scalable, secure, and high-performance data pipelines on Google Cloud Platform (GCP). This remote W2 contract role is ideal for engineers who specialize in ETL/ELT, data modeling, and cloud-native analytics services (BigQuery, Dataflow, Pub/Sub). If you’ve implemented production-grade pipelines, automated infrastructure with Terraform or Deployment Manager, and helped data teams deliver timely analytics and ML-ready datasets — we want to hear from you. OPT visa candidates welcome.

Keywords: GCP Data Engineer, Google Cloud, BigQuery, Dataflow, Pub/Sub, Terraform, ETL, ELT, data pipelines, remote data engineer, contract W2, OPT.


What you’ll do (Responsibilities)

  • Design, implement, and operate robust ETL/ELT pipelines on GCP to move, transform, and prepare large-scale datasets for analytics and ML.
  • Build production data workflows using BigQuery, Dataflow, Pub/Sub, Cloud Storage, and related GCP services (a minimal pipeline sketch follows this list).
  • Automate infrastructure provisioning and deployments using Terraform or Deployment Manager (IaC).
  • Monitor pipeline performance and reliability; apply optimization and cost-control best practices.
  • Troubleshoot pipeline failures, perform root-cause analysis, and implement long-term fixes.
  • Collaborate with data scientists, analysts, and product teams to understand requirements and deliver high-quality data products.
  • Maintain documentation of data models, ETL processes, runbooks, and architecture diagrams.
  • Implement data quality checks, lineage, and governance controls for reliable downstream analytics.
  • Improve security posture for data services (IAM roles, encryption, private networks) and ensure compliance with company policies.
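
To give candidates a concrete sense of this work, below is a minimal sketch of a streaming Pub/Sub-to-BigQuery pipeline using the Apache Beam Python SDK (the framework behind Dataflow). It is illustrative only: the project, topic, bucket, table, and schema names are hypothetical placeholders, not details of any actual project.

```python
# Minimal sketch of a streaming Pub/Sub -> BigQuery pipeline, assuming
# the Apache Beam Python SDK (pip install "apache-beam[gcp]").
# All resource names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub message payload into a BigQuery-ready row."""
    return json.loads(message.decode("utf-8"))


def run() -> None:
    options = PipelineOptions(
        streaming=True,
        project="my-gcp-project",            # hypothetical project ID
        runner="DataflowRunner",
        region="us-central1",
        temp_location="gs://my-bucket/tmp",  # hypothetical staging bucket
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-gcp-project/topics/events")
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                table="my-gcp-project:analytics.events",
                schema="event_id:STRING,user_id:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
            )
        )


if __name__ == "__main__":
    run()
```

Running this with the DataflowRunner submits it as a managed Dataflow job; swapping in the DirectRunner runs the same pipeline locally for testing.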

Required Qualifications

  • 3–5 years of professional experience in data engineering or cloud engineering roles.
  • Strong experience with Google Cloud Platform — specifically BigQuery, Dataflow, Pub/Sub, Cloud Storage.
  • Hands-on ETL/ELT experience: designing, building, and maintaining production data pipelines.
  • Infrastructure-as-Code experience with Terraform or Deployment Manager.
  • Proficient in SQL, with experience in performance tuning and cost optimization in BigQuery (see the dry-run cost check after this list).
  • Programming experience in Python and/or Java (for Dataflow/Beam pipelines).
  • Familiarity with CI/CD for data pipelines and version control (Git).
  • Strong debugging, troubleshooting, and root-cause analysis skills.
  • Excellent communication skills and ability to work remotely in cross-functional teams.
  • Eligible to work as a W2 contract employee (OPT visa accepted).
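
As one concrete illustration of BigQuery cost control, the sketch below uses a dry run to estimate how much data a query would scan before actually running it. It assumes the google-cloud-bigquery client library; the table name and the 10 GB budget are hypothetical.

```python
# Minimal sketch of a common BigQuery cost-control practice: a dry run
# that validates the query and reports bytes scanned, at no cost.
# Assumes google-cloud-bigquery; the table below is hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

query = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-gcp-project.analytics.events`   -- hypothetical table
    WHERE DATE(ts) = CURRENT_DATE()
    GROUP BY user_id
"""

dry_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
dry_job = client.query(query, job_config=dry_config)
print(f"Estimated scan: {dry_job.total_bytes_processed / 1e9:.2f} GB")

# Only run the query for real if the estimated scan fits the budget.
if dry_job.total_bytes_processed < 10 * 1e9:  # 10 GB budget, arbitrary
    for row in client.query(query).result():
        print(row.user_id, row.events)
```

The same pattern extends to partitioned and clustered tables, where pruning on the partition column is the main lever for reducing bytes scanned.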

Preferred / Nice-to-have

  • Experience with Apache Beam, Airflow (Cloud Composer), or other workflow orchestrators (a sample DAG sketch follows this list).
  • Knowledge of data modeling (star schema, normalized models) and OLAP design.
  • Exposure to MLOps and preparing feature stores or ML-ready datasets.
  • Familiarity with data observability tools and data quality frameworks.
  • Prior experience in a client-facing or consultancy environment.
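
For context on orchestration, here is a minimal sketch of a daily rollup DAG of the kind Cloud Composer runs. It assumes Apache Airflow 2.4+ with the Google provider package installed; the DAG ID, table names, and SQL are hypothetical.

```python
# Minimal sketch of a daily BigQuery rollup DAG, assuming Airflow 2.4+
# with apache-airflow-providers-google installed. Names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_events_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_events",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE
                      `my-gcp-project.analytics.daily_rollup` AS
                    SELECT DATE(ts) AS day, COUNT(*) AS events
                    FROM `my-gcp-project.analytics.events`
                    GROUP BY day
                """,
                "useLegacySql": False,
            }
        },
    )
```

Cloud Composer picks up a DAG like this once the file is synced into the environment's dags/ folder in Cloud Storage.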

Benefits & Perks (for contractors)

  • Remote-first contract role — work from anywhere.
  • Opportunity to work on large-scale, high-impact data projects and a modern GCP stack.
  • Collaborative environment with data scientists, analysts, and cloud engineers.
  • Competitive contract pay (details provided during interview).
  • Flexible hours (role may require overlap with business hours depending on project).

Why Join

Join a fast-moving data engineering team that values automation, engineering rigor, and clean data. You’ll operate on a modern GCP stack, influence data architecture decisions, and directly enable analytics and machine learning initiatives that impact product and business outcomes. Great role for engineers who enjoy building end-to-end data systems and improving data quality, reliability, and cost-efficiency.


How to Apply

Email your resume to udaykumar@arksintelitech.com with the subject line:
GCP Data Engineer — [Your Name] — 3–5 yrs — Remote — OPT/W2

In your email, please include:

  1. A short 2–3 line summary of your experience with GCP and ETL pipelines.
  2. Key projects or achievements (BigQuery/Dataflow/Terraform examples).
  3. Your current availability / earliest start date and timezone.

SEO Meta Description (for job page)

Hiring: GCP Data Engineer (Remote, Contract, W2). 3–5 years of experience required. Hands-on BigQuery, Dataflow, Pub/Sub, Terraform. OPT candidates welcome. Email resume to udaykumar@arksintelitech.com.


Social + Short Blurbs (for LinkedIn, Twitter)

LinkedIn (short):
GCP Data Engineer — Remote (Contract, W2) | 3–5 yrs | BigQuery, Dataflow, Pub/Sub, Terraform | OPT welcome. Share resume: udaykumar@arksintelitech.com

Twitter (tweet):
Now hiring: GCP Data Engineer (Remote) — 3–5 yrs — BigQuery/Dataflow/Terraform — W2 role, OPT welcome. Send resume → udaykumar@arksintelitech.com


Suggested Job Tags

GCP Data Engineer, Google Cloud, BigQuery, Dataflow, PubSub, Terraform, ETL, ELT, Remote Job, Contract, W2, OPT, Data Engineering, Cloud Engineering, Data Pipelines, Python, Apache Beam, Cloud Composer.

