We are looking for a Data Engineer to design and build data warehouses in the cloud, providing efficient analytical and reporting capabilities across global and regional sales and finance teams. As the Data Engineer, you will be required to understand existing solutions, fine-tune them and support them as needed. The Data Engineer will:
* Develop highly scalable data pipelines to load data from various source systems, using Apache Airflow to orchestrate, schedule and monitor the workflows
* Build generic and reusable solutions meeting data warehousing design standards for complex business requirements.
Data quality is paramount, and you will be expected to meet high standards for both data and software quality. You will join a rapidly growing team with plenty of interesting technical and business challenges to solve; therefore, you must be a self-starter who learns fast, adapts well to changing requirements and works well with cross-functional teams.

Qualifications:
* 6+ years of hands-on data modeling and data engineering experience
* Strong expertise in dimensional modeling and data warehousing
* Database design and development experience with relational or MPP databases such as Postgres, Oracle, Teradata or Vertica
* Experience in design and development of custom ETL pipelines using SQL and scripting languages (Python, Shell, Go)
* Proficiency in advanced SQL and query performance tuning
* Hands-on experience with big-data platforms such as Hadoop, MapReduce and Hive
* Experience with cloud computing platforms such as AWS and Google Cloud
* Familiarity with version control and migration tools for database and software
* Experience working with APIs will be a plus
* Ability to learn and adapt to new tools and technologies
* Analytical and mathematical mind, capable of evaluating and solving various complex problems
* Ability to work individually or as part of a team
* Ability to learn quickly in a fast-paced environment
* Excellent oral and written communication skills
* BS or MS in Engineering or Computer Science