Posted: Apr 13, 2026

Senior Data Engineer – Remote Data Architecture & Scalable ETL Development at arenaflex

Why arenaflex?

arenaflex is a world-renowned leader in entertainment, media, and technology, constantly pushing the boundaries of storytelling and digital experiences. With a portfolio that spans streaming services, sports broadcasting, and immersive interactive platforms, arenaflex leverages cutting-edge data solutions to create magical moments for millions of fans worldwide. As the company continues to expand its digital footprint, the demand for visionary data professionals who can design, build, and operate massive data pipelines has never been higher.

Our Remote Data Engineering team sits at the heart of this transformation. Working from the comfort of your own home, you will partner with product innovators, content strategists, and analytics scientists to turn raw data into actionable insight that powers everything from personalized recommendations to real-time sports analytics.

Position Overview

arenaflex is seeking an experienced Senior Data Engineer to join the Item Execution and Instrumentation Group (IEIG). In this role, you will lead the design, development, and operational excellence of large-scale data platforms that serve the entire arenaflex ecosystem. Your expertise in cloud technologies, data lakehouse architecture, and modern programming languages will enable the organization to deliver high-performance data solutions that drive business value across multiple verticals.

This is a full-time, 100% remote position based in the United States, offering a competitive salary range of $35,000 – $40,000 per year plus a comprehensive benefits package.

Key Responsibilities

- Architect & Build Scalable Data Pipelines: Design, implement, and maintain robust ETL/ELT workflows in Scala, Python, and PySpark that process terabytes of data daily across cloud (AWS) and on-premise environments.
- Lakehouse Engineering: Drive the migration to a lakehouse-driven data platform using Snowflake, Delta Lake, and Databricks, ensuring seamless integration with existing data marts.
- Collaborate with Cross-Functional Teams: Partner with Data Product Managers, Data Scientists, and Business Intelligence analysts to translate business requirements into technical solutions.
- Maintain SLA Compliance: Monitor pipeline health, troubleshoot incidents, and continuously improve system uptime to meet strict Service Level Agreements (SLAs).
- Documentation & Governance: Produce clear, up-to-date documentation of data models, pipeline architecture, and operational procedures to support data quality and governance initiatives.
- Agile Participation: Actively contribute to Scrum ceremonies, sprint planning, and retrospectives, fostering a culture of continuous improvement.
- Problem Solving & Innovation: Investigate emerging data challenges, propose automation opportunities, and optimize cost-efficiency across the data stack.
- Stakeholder Engagement: Build strong relationships with internal customers, translating complex technical concepts into understandable business value.

Essential Qualifications

- Bachelor's degree in Computer Science, Software Engineering, Data Engineering, or a related technical field.
- 5+ years of professional experience designing and operating large-scale data pipelines.
- Deep proficiency in SQL, with the ability to craft performant queries for complex analytical workloads.
- Extensive hands-on experience with Apache Spark (including PySpark) and Flink for real-time stream processing.
- Strong programming skills in Scala and Python, including best practices for code modularity and testing.
- Solid background in AWS services such as S3, EMR, EC2, and IAM.
- Demonstrated expertise with at least one major MPP or cloud data warehouse technology (Snowflake, Redshift, BigQuery, etc.).
- Familiarity with data lakehouse concepts, Delta Lake, and Databricks orchestration.
- Experience working within Agile/Scrum frameworks and a commitment to collaborative delivery.
- Excellent communication and interpersonal skills, capable of influencing cross-functional teams.

Preferred Qualifications & Additional Skills

- Master's degree or advanced certifications in data engineering, cloud architecture, or big data technologies.
- Hands-on experience with data visualization tools (Tableau, Looker, Power BI) and supporting data pipelines for analytics.
- Knowledge of CI/CD pipelines for data engineering (e.g., GitHub Actions, Jenkins, CircleCI).
- Familiarity with containerization (Docker, Kubernetes) and infrastructure-as-code (Terraform, CloudFormation).
- Exposure to machine learning workflows and model-serving pipelines.

Core Skills & Competencies

- Analytical Thinking: Ability to break down complex problems, design elegant solutions, and anticipate downstream impacts.
- Performance Optimization: Expertise in tuning Spark jobs, SQL queries, and storage formats for speed and cost-effectiveness.
- Data Quality Advocacy: Commitment to data profiling, testing, and monitoring to ensure trustworthy datasets.
- Collaboration: Proven track record of working effectively with product, engineering, and business stakeholders.
- Adaptability: Comfort navigating fast-changing priorities in a high-growth, innovative environment.

Growth & Development Opportunities

arenaflex invests heavily in the professional development of its team members. As a Senior Data Engineer, you will have access to:

- Annual learning stipend for conferences, certifications, and online courses.
- Mentorship programs pairing you with senior architects and data science leaders.
- Opportunities to lead high-visibility projects that influence company-wide data strategy.
- Clear career pathways toward Data Architecture Lead, Head of Data Engineering, or Product Management roles.
Work Environment & Culture

At arenaflex, we pride ourselves on a vibrant, inclusive culture that celebrates creativity and technical excellence. Our remote-first policy means you can work from anywhere in the United States while staying tightly connected to a globally distributed team. Highlights of our culture include:

- Flexibility: Flexible work hours that respect work-life balance and different time zones.
- Collaboration: Regular virtual coffee chats, team-building events, and cross-department hackathons.
- Diversity & Inclusion: Employee resource groups, inclusive hiring practices, and a commitment to equitable growth.
- Innovation-Driven: An environment that encourages experimentation, rapid prototyping, and data-informed decision-making.

Compensation, Perks & Benefits

arenaflex offers a competitive compensation package designed to attract top talent. While the base salary range for this role is $35,000 – $40,000 per year, total rewards may include:

- Performance-based bonuses tied to project milestones and business impact.
- Comprehensive health, dental, and vision insurance.
- Retirement savings plan with company matching contributions.
- Paid time off, parental leave, and a generous holiday schedule.
- Remote work allowance for home office setup, high-speed internet, and ergonomic equipment.
- Access to streaming and entertainment subscriptions as part of our brand heritage.

How to Apply

If you are passionate about building data ecosystems that power unforgettable experiences and you thrive in a remote, fast-paced environment, we want to hear from you. Click the link below to submit your application, including a resume and a brief cover letter highlighting your most relevant achievements.

Apply Now – Join arenaflex's Data Engineering Team!

Take the Next Step with arenaflex

At arenaflex, your work will directly influence the stories that captivate audiences worldwide.
Join us to shape the future of entertainment through data, unlock new possibilities for creators and fans, and grow your career alongside industry pioneers. Apply today and become a cornerstone of our data‑driven journey.
Interested in this role? Apply on iHire.