Posted: 12 January

DataOps Engineer (AWS)

Company

NorthBay Solutions

NorthBay Solutions is a global B2B technology consulting firm and AWS Premier Partner specializing in AWS professional services, including AI/ML and cloud migrations, and serving diverse industries such as Automotive and Healthcare.

Job Type

Full-time

Allowed Applicant Locations

India

Job Description

Job Title: DataOps Engineer (AWS)

Experience: 5–7 Years
Employment Type: Full-Time
Work Mode: Remote
Location: India

Job Summary

We are looking for a skilled DataOps Engineer (AWS) to design, build, and manage scalable cloud infrastructure and data platforms. The ideal candidate will have strong experience with AWS, Terraform, CI/CD, and ETL/data pipelines, and will work closely with data engineers, analytics teams, and application developers to ensure reliable, secure, and high-performing systems.

Key Responsibilities

  • Design, deploy, and manage AWS cloud infrastructure using Infrastructure as Code (Terraform)
  • Build, maintain, and optimize ETL/data pipelines for large-scale data processing
  • Automate infrastructure provisioning, deployment, and monitoring
  • Develop and manage CI/CD pipelines for data and application workloads
  • Ensure high availability, scalability, security, and cost optimization of AWS environments
  • Support data ingestion, transformation, and orchestration workflows
  • Monitor systems, troubleshoot issues, and improve system reliability
  • Implement logging, monitoring, and alerting for infrastructure and data pipelines
  • Collaborate with Data Engineers, Analytics teams, and DevOps teams
  • Enforce security best practices, IAM policies, and compliance standards

Required Skills & Qualifications

  • 5–7 years of experience in DevOps, Data Engineering, or Cloud Engineering
  • Strong hands-on experience with AWS services (EC2, S3, RDS, Lambda, EMR, Glue, Redshift, etc.)
  • Solid experience with Terraform for infrastructure automation
  • Experience building and maintaining ETL/data pipelines
  • Proficiency in Python and/or Shell scripting
  • Experience with CI/CD tools (GitHub Actions, Jenkins, GitLab CI, etc.)
  • Strong understanding of Linux systems and networking fundamentals
  • Experience with monitoring tools (CloudWatch, Prometheus, Grafana, ELK, etc.)

Preferred / Nice-to-Have Skills

  • Experience with Airflow, AWS Glue, or similar orchestration tools
  • Knowledge of Docker and Kubernetes
  • Experience with data warehouses and analytics platforms
  • Understanding of security best practices and data governance
  • Exposure to cost optimization and performance tuning on AWS