
Data Engineer - San Francisco, CA and Jersey City, NJ

  • Hybrid
  • Chicago, Illinois, United States
  • $65 - $70 per hour

Job description

Location: San Francisco, CA and Jersey City, NJ
Work Mode: Hybrid (3 Days Onsite / 2 Days Remote)
Employment Type: W2 Only (No C2C or 1099)
Candidate Requirement: Local candidates preferred (must be located near San Francisco or Jersey City)

As an AWS Data Engineer, you will be a key member of our data engineering team, responsible for building and managing scalable data solutions on the AWS platform. This role requires strong expertise in AWS technologies, big data frameworks, and data pipeline orchestration. You will work closely with data scientists, analysts, and business stakeholders to ensure data availability, quality, and governance.

  • Data Onboarding: Integrate diverse data sources into the AWS-based data lake with consistency and reliability.

  • Pipeline Development: Design and develop scalable data pipelines using services like AWS Lambda, Step Functions, and EMR.

  • Metadata Management: Register data assets and manage metadata to enhance data discoverability.

  • Data Quality: Implement validation checks and data transformation processes to maintain high data integrity.

  • Governance & Compliance: Ensure data practices align with security, privacy, and compliance standards.

  • Infrastructure as Code: Automate and manage cloud infrastructure using Terraform.

  • Big Data Processing: Utilize Apache Spark to process large-scale datasets efficiently.

  • Workflow Orchestration: Manage and automate data workflows using Airflow and AWS Step Functions.

  • Data Modeling: Work with Snowflake and Iceberg table formats to design optimized storage schemas.

  • Team Collaboration: Collaborate across departments to gather requirements and deliver robust data solutions.

Job requirements

  • Proven experience in AWS data engineering using Lake Formation, Lambda, EMR, and related services.

  • Strong proficiency in Python and Terraform scripting.

  • Familiarity with orchestration and transformation tools such as Airflow and dbt.

  • Hands-on experience with Snowflake, RDS, Jupyter, and Iceberg table formats.

  • Expertise in ETL design, data pipelines, and data warehousing concepts.

  • Understanding of data governance frameworks and best practices.

  • Strong problem-solving skills and the ability to troubleshoot complex data systems.

  • Excellent communication and teamwork skills.

Must-Have Skills

  • Cloud/Data Engineering: AWS (Lake Formation, Lambda, EMR, Step Functions, EC2, EKS)

  • Big Data & Processing: Spark, Data Lakes, Data Pipelining

  • Programming & Scripting: Python, Terraform

  • Orchestration & Workflow: Airflow, dbt

  • Data Modeling & Warehousing: Snowflake, Iceberg table formats

  • Tools: Jupyter Notebook, RDS

Preferred Qualifications

  • Certifications such as:

      • AWS Certified Data Analytics – Specialty

      • AWS Certified Solutions Architect

  • Experience working in fast-paced, data-centric environments

TopTech Talent is proud to be an equal opportunity workplace and is an affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, age, national origin, citizenship status, disability, protected veteran status, gender identity, or any other factor protected by applicable federal, state, or local laws.

🚫 Third-party recruiters, please do not reach out for this role.
