Data Scientist – Atlanta, GA / Windsor, CT (Local Candidates Only)

By USA JOB Finder


Job Title: Data Scientist

Location: Atlanta, GA / Windsor, CT – Local Candidates Only

Duration: Long Term Contract

Key Responsibilities

Data Wrangling & Feature Engineering – Ingest, clean, and transform data from SQL databases, APIs, and data lakes such as Snowflake and Databricks. Build robust pipelines that support analytics and machine learning workflows.

Work with domain and subject matter experts to capture the meaning, context, quality, and limitations of available datasets. Translate business questions into data requirements and analytics approaches.

Build, tune, and validate predictive models using frameworks such as scikit-learn, SparkML, XGBoost, and TensorFlow.

Partner with marketing, sales, and product teams to define business use cases and success metrics, and to integrate models into operational workflows.

Deploy and manage models using MLflow, Docker, and CI/CD pipelines. Develop versioning, testing, performance monitoring, and retraining plans as part of an end-to-end MLOps practice.

Collaborate with data engineering and DevOps to maintain and improve model training and deployment infrastructure, including compute provisioning, workflow orchestration, and configuration tuning.

Deliver insights through clear, actionable reporting and visualizations in tools such as Power BI or Tableau, with a focus on business impact rather than analysis alone.

Skills Required:

Bachelor’s degree in data science, computer science, engineering, or a related quantitative field.

5+ years of experience in data science, machine learning engineering, or analytics.

Strong programming skills in SQL and Python, along with a solid grounding in machine learning techniques.

Experience with Azure Cloud, Databricks, and/or Snowflake.

Proven record of building and deploying machine learning models into production environments, with hands-on experience using Databricks, SparkML, and MLflow integration.

Understanding of MLOps best practices, including version control and model monitoring; experience with tools such as Git, MLflow, Docker, and a workflow scheduler.

Ability to explain complex technical issues to non-technical stakeholders. Experience with scalable model training and distributed computing environments.

Background in financial services, fintech, or enterprise B2B analytics. Understanding of A/B testing, causal inference, and statistical experimentation. Familiarity with GenAI, LLM pipelines, and vector-based retrieval is a plus, as is experience with platforms such as Snowflake Cortex.

Email: Raju.d@globalapplications.com
