Big Data Engineer

  • Engineering
  • Hackensack, United States

Job description

About Appeagle

Appeagle is a price intelligence company that provides strategic price automation, competitive insight, and data analytics to online retailers. Our mission is to empower sellers to make smart, data-driven decisions by making complex pricing information easier to access and understand.

We are passionate about what we do and gravitate toward individuals who share our vision and desire to challenge conventional thinking. We believe wholeheartedly in work-life balance and in providing the culture needed to nurture creative minds. We strive to hire people who are not only accomplished in their respective fields, but who also pursue passions in their personal time.

What’s the opportunity?

The Analytics Team is a highly committed group of data nerds who oversee data and business intelligence for Appeagle. We're looking for an engineer who is excited to build big data platforms for processing large data sets, design data warehouses, and collect and load data from many sources. You will be able to amplify our product’s intelligence and advance customer performance metrics, all while working with leading-edge cloud-based technologies.

Requirements

What will I be doing?

If you are both a team player and independently resourceful, you will have the opportunity to lead us into the next generation of data-driven software. You have exceptional attention to detail, take an inquisitive approach to problem solving, and are able to keep the bigger picture in focus. As a member of our team, you will focus on the mission-critical use of data in support of Appeagle’s business and partners.

  • Coordinate adding new data sources
  • Design storage schemes in various AWS technologies
  • Support analysis initiatives by configuring AWS EMR clusters, Data Pipeline, Spark and other big data systems
  • Monitor and troubleshoot data flows, ETL/ELT’s, and maintenance processes
  • Ensure the integrity of data in big data systems

What skills do I need?

  • MapReduce and Hadoop experience (HDFS)
  • Experience with data warehousing (OLAP)
  • Extract Transform and Load (ETL) development and maintenance experience
  • Experience with data modeling
  • Understand business requirements and translate into technical specifications
  • Clearly communicate analysis results and recommendations
  • Collaborate and take ownership of work

Bonus skills and attributes

  • At least 3 years of relevant business intelligence / data warehouse experience
  • Experience with AWS data tools such as EMR and Data Pipeline
  • Experience with Spark, Hive, Pig, or other big data processing frameworks
  • Experience with Linux/UNIX and Python, Perl, or Ruby
  • Experience with machine learning or statistical analysis
  • Experience with database/SQL performance tuning

Benefits

We believe wholeheartedly in work-life balance and are a well-treated bunch. If there’s something important to you that’s not on this list, let’s talk about it! :)

  • Competitive salary and 401k plan + company match
  • Breakfast and lunch on the house daily, plus a fully stocked fridge with drinks and snacks to keep the dreaded afternoon hunger attack at bay
  • Flexible vacation policy and paid holidays to keep you well rested
  • Paid Parental leave to let you spend valuable time with your loved ones
  • Health reimbursement plan
  • A workspace that’s collaborative, not distracting... okay, you may get shot with a Nerf dart
  • Supportive mentors that care about your growth