Data Engineer Job at EPITEC, Dearborn, MI

  • EPITEC
  • Dearborn, MI

Job Description

W-2 POSITION. No C2C will be accepted.

Position Description: We are interested in hiring a Data Engineer. The candidate will be responsible for designing and developing the transformation and modernization of big data solutions on GCP, integrating native GCP services, and building and enhancing data products in GCP. We are looking for candidates who have a broad set of technology skills across these areas and who can demonstrate an ability to design and develop the right solutions.

Key Responsibilities:

  • Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment of the data platform.
  • Implement methods for automation of all parts of the pipeline to minimize labor in development and production.
  • Identify, develop, evaluate, and summarize Proof of Concepts to prove out solutions.
  • Test and compare competing solutions and report out a point of view on the best solution.
  • Apply experience with large-scale solutioning and operationalization of data warehouses and analytics platforms on GCP.
  • Design and build production data engineering solutions to deliver pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, DataFlow, Pub/Sub, DataProc, SQL, Compute Engine, Cloud Functions.
  • Orchestrate pipelines using tools such as Apache Airflow.
  • Build new and enhance existing data products in GCP.
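For illustration only (this sketch is not part of the posting), the batch pipeline pattern described above — extract from storage, transform, load into a warehouse — has a simple shape. The code below uses plain Python stand-ins for the GCP services; every function and record here is a hypothetical placeholder, not a real GCP API.

```python
# Illustrative sketch of a batch ETL pipeline pattern. In a real GCP
# deployment, extract() would read from Cloud Storage, transform() would
# run as a Dataflow/Dataproc job, and load() would write to BigQuery.
# All names and data here are hypothetical stand-ins.

def extract():
    # Stand-in for reading raw records from a Cloud Storage bucket.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": "3.0"},
            {"id": 3, "amount": "bad"}]

def transform(records):
    # Stand-in for a transform job: cast types, drop malformed rows.
    out = []
    for r in records:
        try:
            out.append({"id": r["id"], "amount": float(r["amount"])})
        except (KeyError, ValueError):
            continue  # skip rows that fail validation
    return out

def load(rows):
    # Stand-in for a BigQuery load job; returns the loaded row count.
    return len(rows)

def run_pipeline():
    return load(transform(extract()))
```

In an orchestrated setup, each of these steps would typically become a task in an Airflow DAG so that retries, scheduling, and dependencies are automated rather than hand-run.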

Skills Required:

  • Understands data architectures and design independent of the technology.
  • Experience with BigQuery, SQL.
  • Experience with Python, Apache Airflow.
  • Exceptional problem-solving and communication skills, with the ability to manage multiple stakeholders.
  • Experience in working with Agile and Lean methodologies.
  • Experience with Test-Driven Development.
  • Exposure to AI/LLM.

Skills Preferred:

  • N/A.

Experience Required:

  • Minimum 5 years of in-depth experience with Java or Python.
  • Minimum 5 years of experience in data engineering pipelines/building data warehouse systems with the ability to understand ETL/ELT principles and write complex SQL queries.
  • Minimum 5 years of GCP experience working in GCP-based Big Data deployments (Batch/Real-Time) leveraging BigQuery, Google Cloud Storage, Pub/Sub, Data Fusion, Dataproc.
  • Minimum 2 years of development experience with data warehousing and Big Data ecosystems.
  • 3 years of experience deploying Google Cloud services using Terraform.
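As an illustration of the ELT principle mentioned above (not part of the posting) — land raw data first, then transform inside the warehouse with SQL — the sketch below uses SQLite from the Python standard library as a stand-in for BigQuery; the table, data, and query are all hypothetical.

```python
import sqlite3

# Illustrative ELT sketch: load raw rows first ("EL"), then transform
# in-database with SQL ("T"). SQLite stands in for BigQuery here;
# the schema and data are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?)",
    [("alice", 10.0), ("alice", 5.0), ("bob", 7.5)],
)

# The transform step, expressed as SQL inside the warehouse: aggregate
# per customer and rank by total spend with a window function.
rows = conn.execute("""
    SELECT customer,
           SUM(amount) AS total,
           RANK() OVER (ORDER BY SUM(amount) DESC) AS spend_rank
    FROM raw_orders
    GROUP BY customer
    ORDER BY total DESC
""").fetchall()
```

The point of the pattern is that the heavy transformation logic lives in warehouse SQL (window functions, aggregations) rather than in application code, which is what the "complex SQL queries" requirement is getting at.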

Experience Preferred:

  • Experience working with AI/LLM models, leveraging Generative AI.

Education Required:

  • Bachelor's or Master's degree in a relevant field.

Additional Information:

  • This is a hybrid position.
