Posted at: 17 March

AWS DevOps Data Engineer

Company

Railroad19

Railroad19 is a Saratoga Springs, NY-based B2B company specializing in custom enterprise software development and cloud-native solutions for Fortune 500 clients, primarily in the media and entertainment industry.

Remote Hiring Policy:

Railroad19 is a fully remote company hiring developers across the United States, with no specific time zone requirements. Team members are distributed nationwide, supporting a flexible work environment.

Job Type

Full-time

Allowed Applicant Locations

United States

Salary

$120,000 to $140,000 per year

Job Description

AWS DevOps Data Engineer 

 

We’re looking for an experienced AWS DevOps Data Engineer to support a large enterprise media organization. This full-time role blends hands-on engineering with DevOps direction, AWS best-practice guidance, and cross-team enablement.

About Railroad19, Inc

  • At Railroad19, Inc, we develop customized software solutions and provide software development services. We’re a specialized team of developers and architects, and we bring an “A” team to the table through hard work and a desire to lead the industry. This is our company culture, and it is what sets Railroad19 apart.
  • As a Railroad19 employee, you will be part of a company that values your work and gives you the tools you need to succeed. Our headquarters is in Saratoga Springs, New York, but this position is 100% remote. Railroad19 provides competitive compensation and excellent benefits, including Medical/Dental/Vision/Pet Insurance, Paid Time Off, and 401(k).
  • No 1099 or Corp-to-Corp (C2C); full-time employment only.
  • No agencies

Core Responsibilities:

  • Build and support AWS infrastructure (EMR, ECS, Lambda, IAM)
  • Create POCs and reference implementations to guide teams on AWS best practices
  • Own and manage GitLab (runners, permissions, pipelines) and support the transition to GitHub Actions
  • Coordinate and enable multiple engineering teams to adopt consistent DevOps workflows
  • Develop Python‑based ETL/automation scripts
  • Build and maintain Airflow DAGs for data workflows
  • Improve CI/CD pipelines, automation, and platform reliability
  • Collaborate across engineering, data, product, and security teams in a large enterprise environment

Skills & Experience

  • 4+ years in cloud engineering, DevOps, or data engineering
  • Strong AWS experience (EMR, ECS, Lambda, IAM)
  • Python development skills
  • Airflow production experience
  • Hands‑on GitLab experience; GitHub Actions a plus
  • Ability to guide teams, influence standards, and operate in a complex enterprise setting

Nice to Have/Preferred

  • Media/entertainment industry experience
  • Terraform or CloudFormation
  • Spark or distributed data processing
  • Strong communication and documentation skills
  • AWS Certifications
$120,000 - $140,000 a year
This is a full-time position with a base annual salary and bonus eligibility at the end of the fiscal year. Full benefits: medical, dental, vision, pet insurance, 401(k), and competitive paid time off.
We may use artificial intelligence (AI) tools to support parts of the hiring process, such as reviewing applications, analyzing resumes, or assessing responses. These tools assist our recruitment team but do not replace human judgment. Final hiring decisions are ultimately made by humans. If you would like more information about how your data is processed, please contact us.