Job Description
Industry: Technology, Information and Internet, Software Development, and Retail
Seniority for this role: Mid-Senior level
We’re always looking for people who are truly passionate about their work. If you are among them, you can rest assured there is a place for you at eMAG. We’ve grown very fast and we are determined to keep doing so. What brought us here is our desire for continuous evolution and practical results. More than 6,000 colleagues are part of eMAG teams. We strongly believe in people development, so every year we invest more energy and resources to remain an organization that is constantly learning. We want to make sure you’ll have the most talented colleagues, as well as the proper environment to grow, achieve great results, and become what you desire on a personal and professional level. Join us, grow faster!

At the beginning of your journey at eMAG, you will receive a crash course in our architectural guidelines and have the opportunity to work on architecture-related initiatives. Afterward, you will be responsible for maintaining the same level of consistency and standards in our domain data applications.

What you’ll have to do:
- Develop domain-driven data applications using the following technologies: Python, PySpark, SQL, Jupyter, GitLab, Apache Airflow, Apache Iceberg;
- Orchestrate assets (timing and dependencies) in Apache Airflow;
- Create reports and complex queries using our internal Data Platform;
- Optimize existing reports and improve application performance;
- Enforce the architectural guidelines provided to you.

How we’d like you to be:
- Strong knowledge of SQL;
- Experience with at least one programming language;
- Spark experience and knowledge of Git;
- Experience implementing tests on your code or data: unit tests, integration tests, data quality tests, etc.;
- Comfortable with a Unix-based command-line terminal;
- Experience with a reporting tool such as Looker, Qlik Sense, or Tableau;
- Experience working on a data warehouse or lakehouse solution;
- Advanced English proficiency.
Will be a plus:
- Familiarity with applying design patterns and abstractions in code;
- Experience with DDL and DML operations and strategies;
- Awareness of the importance of coding standards;
- Experience with any of the following technologies: Apache Airflow, Apache Iceberg, Impala, Dremio or another query engine, GitLab, Jupyter; data lake experience;
- Agile experience.

What we’ve prepared for you:
- Medical subscription: Medicover, MedLife or Regina Maria.
- A flexible budget you can invest in yourself as you wish: meal tickets, holiday tickets, cultural vouchers, private pension, foreign language classes, eMAG, Fashion Days, Tazz, Therme & Genius, membership to different gyms, or even professional development classes.
- Discounts from our partners: banking, mobile, dental medicine or wellness.
- Access to the Bookster library and free credits on the Atlas psycho-emotional health platform.
- An accelerated learning environment, with access to over 100,000 curated online resources and platforms, learning academies and development programs.
- New headquarters, where sleek design, natural light, and versatile spaces create an energizing and comfortable environment for hybrid work.

Curious to find out more about the next step in your career? Apply now and, if your experience is relevant for the role, we will give you a call with more details!