Staff Software Engineer, Big Data

Remote

Job Description / Skills Required

About Data Science Engineering (DSE) 

DSE at GoPro owns our in-house data platform infrastructure, data engineering, and automated data analytics reporting. We enable and empower our partners on the product, engineering, product analytics, and marketing teams by providing the infrastructure, tools, services, and visualizations they need to access data and business reports. We also prepare data and metrics to support data scientists and business operations.

About the role: 

As part of the DSE team, you will work on vast amounts of data from GoPro’s ecosystem. This includes our cameras, applications, cloud services, and web applications. You will enable different business units and the analytics team to make data-driven decisions by providing validated, fit-for-purpose, quality datasets that serve as a single source of truth across the organization.

What you will likely do:

  • Understand business requirements, assess the level of effort, and break the solution down into granular development tasks.
  • Work with business and engineering/solution teams to understand the upstream data sources / raw datasets and develop data models to build quality datasets.
  • Design, develop, test, deploy, and support data pipelines to serve marketing, product engineering, e-commerce, and analytics use cases.
  • Lead design, implementation, and operations of data platforms and tools for ingesting, storing, processing, and querying data at scale.
  • Provide technical leadership in designing and developing highly reliable data products using the Big Data ecosystem and software engineering best practices.
  • Create and maintain documentation and technical specifications.
  • Create metrics and graphs to visualize and validate datasets.

About you:

  • Strong software development experience with proficiency in Scala or Java.
  • You are passionate about the architecture of the Big Data technology stack and its layers: data modeling, data lakehouse, data pipelines, and data analytics.
  • Experience designing and building scalable, reliable data pipelines using the Big Data ecosystem (Hive, Spark, Databricks, Presto, Kafka, Airflow, or equivalents).
  • Experienced in creating, modifying, and querying database entities (tables, views) using SQL optimized for performance, with knowledge of data warehouse data models.
  • Experienced in designing and implementing scalable, reliable services using AWS or other cloud platforms.
  • Knowledge of Machine Learning Model Operationalization (MLOps) is a plus.
  • You can synthesize business requirements and translate them into technical requirements.
  • You are a strong problem solver with meticulous attention to detail and can tackle loosely defined problems.
  • You have excellent interpersonal skills and are passionate about collaborating with both technical and non-technical peers.

GoPro Highlights

  • Get your very own GoPro (Mounts and accessories included);
  • Competitive salary and discretionary annual performance-related bonus;
  • Gym fee compensation / LiveHealthy Wellness Program;
  • Discounted employee stock purchase plan;
  • Excellent healthcare insurance coverage;
  • Life insurance and disability benefits;
  • Professional + personal development opportunities, i.e. LinkedIn Learning;
  • Opportunities to get involved in the causes that you care about (annual camera donation + volunteer events).

We strive for the day that no group can be described as underrepresented at GoPro – whether as part of our brand or in our workforce. We are committed to providing a more inclusive, representative, equal, just and happy world. GoPro is proud to be an Equal Opportunity Employer.

#flexible #Data #Scala #Java #SQL #ETL #BigData