Job Description: Data Warehouse and Data Modelling
10+ Years of Relevant Experience
- Minimum 5 years of experience in modern data engineering, data warehousing and data lake technologies on cloud platforms such as Azure, AWS, GCP, Databricks, etc.; Azure experience is preferred over other cloud platforms.
- 10+ years of proven experience with SQL, schema design and dimensional data modelling.
- Solid knowledge of data warehouse best practices, development standards and methodologies.
- Experience with ETL/ELT tools such as ADF, Informatica, Talend, etc., and data warehousing technologies such as Azure Synapse, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
- Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL (see the sketch after this list).
- An independent self-learner with a “let’s get this done” approach and the ability to work in a fast-paced, dynamic environment.
- Excellent communication and teamwork abilities.
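To make the dimensional-modelling and PySpark/Spark SQL expectations above concrete, here is a minimal sketch of the kind of star-schema aggregation the role involves. All table and column names (fact_sales, dim_customer, dim_date, customer_key, net_amount, etc.) are hypothetical placeholders for illustration only, not part of this posting.

```python
# Minimal star-schema sketch in PySpark / Spark SQL.
# Assumes the fact and dimension tables are already registered in the
# catalog (e.g. as Delta tables in Databricks or tables in Azure Synapse).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("star-schema-demo").getOrCreate()

fact_sales   = spark.table("fact_sales")      # grain: one row per order line
dim_customer = spark.table("dim_customer")    # conformed customer dimension
dim_date     = spark.table("dim_date")        # calendar dimension

# DataFrame API: join the fact table to its dimensions on surrogate keys
# and aggregate revenue by customer segment and calendar month.
revenue_by_segment = (
    fact_sales
    .join(dim_customer, "customer_key")
    .join(dim_date, "date_key")
    .groupBy("customer_segment", "calendar_month")
    .agg(F.sum("net_amount").alias("total_revenue"))
)

# Equivalent Spark SQL, for teams that prefer working in pure SQL.
revenue_by_segment_sql = spark.sql("""
    SELECT c.customer_segment,
           d.calendar_month,
           SUM(f.net_amount) AS total_revenue
    FROM   fact_sales f
    JOIN   dim_customer c ON f.customer_key = c.customer_key
    JOIN   dim_date     d ON f.date_key     = d.date_key
    GROUP  BY c.customer_segment, d.calendar_month
""")

revenue_by_segment.show()
```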
Nice-to-Have Skills
- Knowledge of Event Hubs, IoT Hub, Azure Stream Analytics, Azure Analysis Services and Cosmos DB.
- Knowledge of SAP ECC, S/4HANA and HANA.
- Intermediate knowledge of Power BI.
- Azure DevOps and CI/CD deployments; cloud migration methodologies and processes.
Required Skills for Data Warehouse and Data Modelling Job
- Azure
- AWS
- GCP
- Databricks
- SQL
- ETL/ELT
- ADF
- Informatica
- Talend
- Azure Synapse
- Azure SQL
- Amazon Redshift
- Snowflake
- Google BigQuery
Our Hiring Process
- Screening (HR Round)
- Technical Round 1
- Technical Round 2
- Final HR Round