Data Engineer | Top USA Data Engineering & Cloud ETL Jobs
Apply for high-paying Data Engineer jobs across the USA. Remote, hybrid, and onsite roles in data pipelines, cloud engineering, ETL, big data, and analytics. Apply now.
The Data Engineer will be responsible for assisting delivery teams with the design and development of multi-tier web-based applications. The work will mostly follow Agile methodology, potentially mixed with Scrum, Waterfall, or Kanban-based approaches. Daily responsibilities include collaborating closely with other developers and testers to adhere to standard quality practices and standards for coding, testing, modifications, and documentation.
This is a critical position supporting investment management data and portfolio accounting; prior experience in this domain is an added advantage.

Minimum Qualifications
Bachelor's degree in Computer Science or a related discipline (such as Engineering or Mathematics)
3-5 years of professional experience with Azure data loads, pipelines, Functions, databases, etc.
Advanced experience writing SSIS ETL processes
3-5 years of in-depth C#, Python, and JavaScript back-end programming experience
10 years of hands-on expertise in SQL
Competence testing data entities and consuming API endpoints using Postman/Swagger (a brief Python sketch follows this list)
Competent know-how with Microsoft Dynamics environments, including invoking batch jobs, exporting schedules and tables, manipulating keys, configuring resource-sharing options, and working from the console
Business knowledge of integrating finoperf.db table views and financial analytical tools into a test_dev environment: configuring connection strings and access methods, creating classes, linking SQL scripts, consuming and inheriting data models, and managing namespaces, references, and imports
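To illustrate the kind of API-endpoint consumption described above (normally exercised through Postman or Swagger), here is a minimal Python sketch using the requests library. The base URL, token, and entity name are hypothetical placeholders, not details from this posting.

```python
import requests

# Hypothetical base URL and token -- stand-ins for whatever data-entity API
# the team actually exposes and documents in Swagger.
BASE_URL = "https://api.example.com/v1"
TOKEN = "REPLACE_WITH_REAL_TOKEN"

def fetch_data_entities(entity: str) -> list[dict]:
    """Call a REST endpoint and return its JSON payload as a list of records."""
    response = requests.get(
        f"{BASE_URL}/{entity}",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()  # surface HTTP errors instead of failing silently
    return response.json()

if __name__ == "__main__":
    accounts = fetch_data_entities("accounts")
    print(f"Fetched {len(accounts)} account records")
```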
Preferred Qualifications
Performance tuning of existing stored procedures, views, functions, and jobs: analyzing and implementing suggested improvements, reviewing and comparing code, handling migrations between server installations, studying the impact and limitations of enhancements, performing pre-execution trial runs, debugging, finalizing problem statements, and versioning and risk-registering modifications.
Hands-on expertise with Python/pandas, including class inheritance, reading and parsing data, manipulating databases, working with serialized records and dictionaries, regex mapping, handling dates and timezones, merging, forward/backward filling, reshaping, grouping, rolling-window statistics, renaming, scaling, working with Series, graphing, sliding-window statistics, finding and removing duplicates, and aligning multiple indexes (a brief pandas sketch follows this list).
Informatica experience: bug logging, trace inspection and resolution, automation, metadata and package usage, report checking, working with agent and configurator utilities, identifying impacted application reports and changes, comparing results, and coordinating with the support teams involved.
Domain knowledge of wealth management concepts such as assets, trustees, holdings, and funds; reconciliation reference attributes and retention classes; familiarity with SQL metadata views (INFORMATION_SCHEMA); and experience working alongside Oracle Financials developers, planners, systems security analysts, and ERP accounting teams.
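As a rough illustration of the pandas work listed above, the sketch below shows grouping, forward filling, rolling-window statistics, de-duplication, and merging on a small made-up holdings table; the column names and data are assumptions, not part of the role.

```python
import pandas as pd

# Hypothetical daily holdings data -- column names are illustrative only.
holdings = pd.DataFrame({
    "account_id": ["A1", "A1", "A1", "A2", "A2", "A2"],
    "as_of_date": pd.to_datetime(["2024-01-01", "2024-01-02", "2024-01-03"] * 2),
    "market_value": [100.0, None, 104.0, 250.0, 255.0, None],
})

# Forward-fill missing values within each account, then drop exact duplicates.
holdings["market_value"] = holdings.groupby("account_id")["market_value"].ffill()
holdings = holdings.drop_duplicates()

# Rolling 2-day mean of market value per account (rolling-window statistics).
holdings["rolling_mean"] = (
    holdings.sort_values("as_of_date")
            .groupby("account_id")["market_value"]
            .transform(lambda s: s.rolling(window=2, min_periods=1).mean())
)

# Merge in a small reference table of account attributes.
accounts = pd.DataFrame({"account_id": ["A1", "A2"], "advisor": ["Smith", "Lee"]})
report = holdings.merge(accounts, on="account_id", how="left")
print(report)
```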
Data Engineer (Azure) – 8+ Years Experience
Location: Scottsdale, AZ (Remote Work) | Job Type: C2C / C2H / W2 | Experience Required: 8+ Years | Must-Have Skills: Azure, Python, SQL, C#, Wealth Management Domain
The original job description follows:
We are seeking a highly skilled Data Engineer with strong knowledge of Azure cloud solutions, Python, SQL, C#, and the Wealth Management domain. This role is remote (based out of Scottsdale, AZ), offering flexibility while working with some of the best engineering and analytics teams. The ideal candidate will have extensive hands-on experience building large-scale data pipelines, integrating data from multiple financial systems, optimizing data architectures, and maintaining high standards of governance and quality.
Work with data architects, business analysts, wealth management SMEs, and other cross-functional team members to implement high-performance data solutions for analytics, dashboarding, regulatory reporting, and digital transformation. Participate in establishing a golden source of data that is made available at the right time and at consistently high quality, enabling business value.
Data Pipeline and ETL Engineering
Develop, implement, and support highly reliable ETL/ELT pipelines using Azure Data Factory, Databricks, Azure Functions, and Synapse Analytics. Set up workflow orchestration and automation that supports scalable ingestion of structured, semi-structured, and unstructured sources. Apply Python, SQL, and C# to develop complex transformation logic and enterprise-level automation. Implement reliable end-to-end data flows optimized for performance and cost.
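As one possible illustration of this transformation logic, here is a minimal PySpark sketch of the sort that might run in a Databricks notebook; the storage paths, column names, and business rules are assumptions for the example, not the project's actual design.

```python
from pyspark.sql import SparkSession, functions as F

# Storage paths and columns below are placeholders, not the real pipeline layout.
spark = SparkSession.builder.appName("trade-transform-sketch").getOrCreate()

raw = spark.read.json("abfss://raw@<storage-account>.dfs.core.windows.net/trades/")

cleaned = (
    raw
    .withColumn("trade_date", F.to_date("trade_date", "yyyy-MM-dd"))
    .withColumn("notional", F.col("quantity") * F.col("price"))
    .dropDuplicates(["trade_id"])           # de-duplicate on the business key
    .filter(F.col("notional").isNotNull())  # drop rows missing pricing data
)

# Write the curated output as Delta, partitioned for downstream analytics.
(cleaned.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("trade_date")
        .save("abfss://curated@<storage-account>.dfs.core.windows.net/trades/"))
```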
Build cloud-native data solutions using Azure Storage, ADLS Gen2, Azure SQL, Synapse, Key Vault, Event Hubs, and Service Bus. Develop secure access patterns and a comprehensive RBAC model, including enterprise policies for all cloud data systems. Monitor Azure workloads and tune them for performance, availability, and security.
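For instance, one common secure-access pattern is to pull connection secrets from Key Vault at runtime rather than hard-coding them. The minimal sketch below uses the azure-identity and azure-keyvault-secrets packages; the vault URL and secret name are placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Vault URL and secret name are placeholders. DefaultAzureCredential resolves
# managed identity, environment variables, or developer credentials in turn.
credential = DefaultAzureCredential()
client = SecretClient(
    vault_url="https://<vault-name>.vault.azure.net",
    credential=credential,
)

sql_conn_string = client.get_secret("sql-connection-string").value
# Pass the secret to the pipeline's connection setup instead of hard-coding it.
```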
Work with business stakeholders across portfolio management, investment operations, trading, financial advisory, and client reporting. Understand wealth management data models such as accounts, holdings, trades and transactions, reconciliation, performance calculation, risk metrics, KYC data, and compliance reporting. Implement data solutions supporting wealth advisory, financial planning, and investment analytics workflows.
Develop conceptual, logical, and physical data models in compliance with enterprise-level standards.
Support modernization of the data architecture as part of the journey to the cloud and the replacement of legacy ETL platforms. Create dimensional models, star schemas, and data marts for reporting and business intelligence (BI) ecosystems.
Deliver high-quality data by defining validation rules, detecting anomalies, and running automated quality checks. Participate in data governance activities, including metadata management, lineage tracking, and data cataloging and documentation. Maintain standards for data protection and compliance with financial regulations.
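As a rough sketch of what automated quality checks can look like (not this team's actual framework), the example below applies a few rule-based validations with pandas; the rules and column names are assumptions.

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality violations found in df."""
    issues = []

    # Rule 1: key columns must not contain nulls.
    for col in ("account_id", "as_of_date"):
        nulls = int(df[col].isna().sum())
        if nulls:
            issues.append(f"{nulls} null values in required column '{col}'")

    # Rule 2: no duplicate (account_id, as_of_date) records.
    dupes = int(df.duplicated(subset=["account_id", "as_of_date"]).sum())
    if dupes:
        issues.append(f"{dupes} duplicate account/date rows")

    # Rule 3: market values must be non-negative.
    negatives = int((df["market_value"] < 0).sum())
    if negatives:
        issues.append(f"{negatives} rows with negative market_value")

    return issues

if __name__ == "__main__":
    sample = pd.DataFrame({
        "account_id": ["A1", "A1", None],
        "as_of_date": ["2024-01-01", "2024-01-01", "2024-01-02"],
        "market_value": [100.0, 100.0, -5.0],
    })
    for issue in run_quality_checks(sample):
        print("DQ violation:", issue)
```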
Collaboration & Agile Delivery
Participate in Agile ceremonies, sprint planning, code reviews, and design discussions. Work with cross-functional teams, data analysts, and developers to deliver solutions end to end. Mentor junior data engineers and provide technical guidance and knowledge sharing.
Required Skills
* 8+ years of hands-on experience as a Data Engineer.
* Strong coding skills in Python, SQL, and C#, with the ability to write optimized, scalable, production-ready code.
* Proven expertise with Azure data engineering tools (ADF, Databricks, Synapse, ADLS, Azure SQL, and others).
* Strong background in ETL/ELT pipeline design and big data management.
* Good knowledge of wealth management business areas covering financial instruments, investment operations, and advisory platforms.
* Hands-on experience with application optimization, troubleshooting, CI/CD, and DevOps.
* Excellent verbal communication, analytical, and problem-solving skills; self-driven and able to work remotely within a distributed team.
* Experience with Azure DevOps, GitHub Actions, or Jenkins for CI/CD automation.
* Knowledge of big data technologies (Spark, Delta Lake, Kafka).
* Familiarity with a data visualization tool such as Power BI or Tableau.
* Prior experience in highly regulated industries such as finance or banking.
* Certifications such as Microsoft Azure Data Engineer Associate or Azure Solutions Architect.
Work on next-generation cloud-based data solutions within a financial organization undergoing digital transformation.
Enjoy a flexible remote role alongside some of the best engineering minds. Apply your Wealth Management experience to critical business decisions and client-facing experiences. Long-term growth potential on challenging, enterprise-level data engineering projects. Competitive pay; C2C, C2H, and W2 options available.
This role will suit a Senior Data Engineer who thrives in innovative cloud environments and enjoys building data pipelines that turn financial data into insight. Apply to join a high-performing team delivering modern data capabilities for wealth management transformation.
Please share your resume at anisha@nexgeniots.com