Website: fcsglobal.us
Data Engineer (GCP Migration / Data Warehouse)
📍 Location: Mostly Remote (Occasional Onsite in Richardson, TX)
🕒 Duration: Contract through Dec 2026 (extension through 2027 based on performance)
🎯 Interview Process: 1 Round
🚨 Hiring Alert
We are actively looking to onboard a skilled Data Engineer for a long-term engagement supporting a critical Teradata-to-Google Cloud Platform (GCP) migration initiative. This is a high-impact opportunity to work on enterprise-scale data transformation and modernization.
The role is primarily remote, with occasional onsite presence required in Richardson, TX for collaboration and key project activities.
Role Overview
As a Data Engineer, you will play a crucial role in designing, building, and optimizing data pipelines as part of a large-scale migration from legacy Teradata systems to modern cloud-based platforms on GCP. You will also support EDP 1.5 enablement, which includes migrating between GCP environments and enhancing data platform capabilities.
The ideal candidate will have strong expertise in data warehousing, SQL, Python, and GCP services, along with hands-on experience in migration and transformation projects. You should be comfortable working in a fast-paced environment and collaborating with cross-functional teams to deliver scalable data solutions.
Key Responsibilities
Design, develop, and maintain scalable data pipelines for data ingestion, transformation, and processing.
Lead and support migration efforts from Teradata to GCP, ensuring data accuracy and integrity.
Support EDP 1.5 enablement, including migration between GCP projects.
Develop and optimize SQL queries for data extraction, transformation, and reporting.
Build and maintain data workflows using Python-based processing frameworks.
Collaborate with data architects, analysts, and business stakeholders to understand requirements.
Ensure efficient data storage and processing using GCP services.
Monitor data pipelines and troubleshoot performance issues.
Implement data quality checks and ensure governance standards are met.
Maintain documentation for data pipelines, processes, and architecture.
Required Skills & Qualifications
Data Engineering & Warehousing
7+ years of experience in:
Data Engineering
Data Warehousing concepts
Strong understanding of:
ETL/ELT processes
Data modeling and architecture
Cloud (GCP)
Hands-on experience with Google Cloud Platform (GCP)
Familiarity with services such as:
BigQuery
Cloud Storage
Dataflow / Dataproc (preferred)
Experience with cloud data migrations
Programming & Querying
Strong proficiency in:
Python
SQL (advanced query writing and optimization)
Migration Experience
Experience working on:
Teradata to cloud migration projects
Data platform modernization initiatives
Preferred Qualifications
Experience with large-scale enterprise data platforms
Familiarity with workflow orchestration tools (Airflow, Composer)
Knowledge of data governance and security practices
Experience working in Agile/Scrum environments
Strong analytical and problem-solving skills
Work Environment
Mostly remote with occasional onsite presence in Richardson, TX
Long-term engagement with potential extension into 2027
Fast-paced and collaborative team environment
Opportunity to work on modern cloud data platforms
Why Join This Opportunity?
Work on large-scale cloud migration projects (Teradata → GCP)
Gain hands-on experience with modern data engineering tools and platforms
Long-term stability with potential extension
Opportunity to work with experienced data professionals
High-impact role contributing to enterprise data transformation
Application Process
If you are a skilled Data Engineer with strong GCP, Python, and SQL experience, we encourage you to apply.
📩 Send your resume to: ragu@fcsglobal.us