The members of our Data Engineering department build systems and infrastructure for collecting, storing, and analyzing very large data sets in batch and streaming pipelines. As a sub-team within Data Engineering, the Business Data team is responsible for the quality, content, and timely delivery of our core analytics and business intelligence data sets. We build and maintain tools that help the rest of the company find, transform, store, and visualize their data. We value curiosity, enthusiasm, responsibility, and generosity of spirit.
We’re looking for a skilled engineer who cares about the data that flows through pipelines and the internal customers who will use that data. We work in Scala, Python, PHP, Java, and SQL, and we work with technologies like Hadoop, Kafka, Scalding, Airflow, Avro/Thrift, Vertica, Looker, GCS, Dataproc, and BigQuery. Experience with any of these is helpful but not required.
About the Role:
- Our team is responsible for the daily delivery of hundreds of business-critical datasets, in both batch and streaming pipelines.
- We help define, maintain, and monitor the ETLs that generate our core business analytics datasets.
- We build and maintain internal tools for finding, querying, transforming, and visualizing data.
- We support the company’s analytics database and work to establish best practices for its users.