Data Engineer
Posted: 2025-06-16
Job title: Data Engineer in USA at Agero
Company: Agero
Job description:

About Agero:
Wherever drivers go, we're leading the way. Agero's mission is to rethink the vehicle ownership experience through a powerful combination of passionate people and data-driven technology, strengthening our clients' relationships with their customers. As the #1 B2B, white-label provider of digital driver assistance services, we're pushing the industry in a new direction, taking manual processes and redefining them as digital, transparent, and connected. This includes: an industry-leading dispatch management platform powered by Swoop; comprehensive accident management services; knowledgeable consumer affairs and connected vehicle capabilities; and a growing marketplace of services, discounts, and support enabled by a robust partner ecosystem. The company has over 150 million vehicle coverage points in partnership with leading automobile manufacturers, insurance carriers, and many others. Managing one of the largest national networks of service providers, Agero responds to approximately 12 million service events annually. Agero, a member company of The Cross Country Group, is headquartered in Medford, Mass., with operations throughout North America.

Role Description & Mission:
We are seeking a Data Engineer who is passionate about data and eager to make a meaningful impact. In this role, you will design, build, and maintain the core data infrastructure that powers our analytics, machine learning, and data science initiatives. Your responsibilities will include optimizing data management processes, ensuring data quality and reliability, and developing scalable, efficient data models to support advanced analytics and data-driven decision-making. Success in this role requires a strong technical foundation, a collaborative mindset, and a drive to deliver innovative and impactful solutions.

Key Outcomes:
- Data Pipelines:
  - Develop and maintain robust ETL/ELT pipelines to ingest data from diverse sources (relational and NoSQL databases, APIs, etc.), including implementing best practices for real-time and batch data ingestion.
  - Create and optimize data workflows using modern orchestration tools (e.g., Apache Airflow, Snowflake Tasks, Dagster, Mage).
- Cloud Cost Optimization:
  - Monitor and optimize cloud costs (e.g., AWS, Snowflake) by analyzing resource usage and implementing cost-saving strategies.
  - Perform query optimization in Snowflake to reduce compute costs and improve performance.
- Data Foundations:
  - Develop and maintain modern data architectures, including data lakes and data warehouses (e.g., Snowflake, Databricks, Redshift), weighing the trade-offs of different data storage solutions and ensuring alignment with business requirements and SLAs.
- Data Modeling & Transformation:
  - Apply dimensional modeling techniques (Kimball), star and snowflake schemas, and normalization vs. denormalization strategies based on the use case.
  - Develop transformations using DBT (Core or Cloud), Spark (PySpark), or other frameworks.
  - Collaborate on emerging approaches such as data mesh or templating tools (e.g., Jinja) to handle complex data needs.
- Coding:
  - Write reusable, efficient, and scalable code in Python, PySpark, and SQL.
  - Integrate serverless computing frameworks or modern API frameworks (e.g., FastAPI, Flask) to support data-driven applications.
  - Develop and maintain data-intensive UIs and dashboards using tools like Streamlit, Dash, Plotly, or React.
- Data Quality Control:
  - Establish data governance and data quality frameworks, using either custom solutions or popular open-source/commercial tools (e.g., DBT tests, Great Expectations, Soda).
  - Implement data observability solutions to monitor and alert on data integrity and reliability (e.g., Monte Carlo, Alation, or Elementary).
  - Define SLAs, SLOs, and processes to identify, troubleshoot, and resolve data issues.
- Teamwork:
  - Work cross-functionally with data scientists, analysts, and business stakeholders to translate requirements into robust data solutions.
  - Follow and advocate for best practices in version control and CI/CD.
  - Document data flows, processes, and architecture to facilitate knowledge sharing and maintainability.
Qualifications:
- Extensive experience with Snowflake (preferred) or other cloud-based data warehousing solutions like Redshift or BigQuery.
- Expertise in building and maintaining ETL/ELT pipelines using tools like Airflow, DBT, Fivetran, or similar frameworks.
- Proficiency in Python (e.g., Pandas, PySpark) for data processing and transformation.
- Advanced SQL skills for querying and managing relational databases, plus experience working with NoSQL databases (e.g., DynamoDB, MongoDB).
- Solid understanding of data modeling techniques, including dimensional modeling (e.g., star schema, snowflake schema).
- Knowledge of query optimization and cost management strategies for platforms like Snowflake and cloud environments.
- Experience with data quality and observability frameworks (e.g., Great Expectations, Soda).
- Proven expertise in designing and deploying data solutions in the cloud, with a focus on AWS services (e.g., EC2, S3, RDS, Lambda, IAM).
- Experience in building and consuming data-intensive APIs using frameworks like FastAPI or Flask.
- Familiarity with version control systems (e.g., Git) and implementing CI/CD pipelines.
- Strong communication and collaboration skills with the ability to explain technical concepts to both technical and non-technical audiences.
- Ability to manage multiple priorities and work independently.
- Experience with data streaming platforms such as Kafka or Kinesis.
- Familiarity with Agile methodologies (Scrum, Kanban) and IaC tools like Terraform or CloudFormation.
- Knowledge of emerging technologies or frameworks in the data engineering ecosystem, such as Delta Lake, Iceberg, or Hudi.
- United States: AZ, CA, FL, IL, KY, MA, MI, NM, NH, TN, GA, NC, VA
- #LI-REMOTE
Benefits:
- Health and Wellness: Healthcare, dental, vision, disability, life insurance, and mental health benefits for associates and their families.
- Financial Security: 401(k) plan with company match and tuition assistance to support your future goals.
- Work-Life Balance: Flexible time off, paid sick leave, and ten paid holidays annually.
- For Contact Center Roles: Accrual of up to 3 weeks Paid Time Off per year, paid sick leave, and ten paid holidays annually.
- Family Support: Parental planning benefits to assist associates through life's milestones.
- Bonus/Incentive Programs
Expected salary: $97,482 - $140,000 per year
Location: USA
Apply for the job now!