
Job Description:
We are looking for an experienced GCP Data Engineer & Architect who has applied technology solutions to grow and optimize digital analytics. You will be responsible for executing a growth strategy for multiple divisions and channels across L'Oréal USA. You will apply in-depth knowledge of technology options, technology platforms, design techniques, and approaches across the L'Oréal ecosystem to design systems that meet business needs. You will analyze business problems and design reliable, scalable data storage and analytics systems that enable the business and our products to scale. You will play a leading role in POCs and in building platforms and datasets using software engineering best practices, data management fundamentals, data storage principles, and operational excellence best practices.
Job Responsibilities:
- Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Create and maintain optimal data pipeline architecture
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Support continuous improvement in DevOps Automation
- Develop data and semantic interoperability specifications
- Participate in architectural discussions and perform system analysis, including a review of existing systems and operating methodologies. Evaluate new technologies and recommend the solutions best suited to satisfying current requirements and simplifying future modifications
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Design and build the infrastructure required for optimal ETL from a variety of data sources into GCP services
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Develop proper data governance and data security practices
- Collaborate with multiple stakeholders including Product Teams, Data Domain Owners, Infrastructure, Security and Global IT
- Design appropriate data models for use in transactional and big data environments as an input to Machine Learning processing.
- Identify, implement, and continuously enhance data automation processes
Job Requirements:
- Knowledge and proven use of contemporary data mining, cloud computing, and data management tools including but not limited to Microsoft Azure, AWS, Google Cloud, Hadoop, HDFS, MapR, and Spark.
- Proven knowledge of implementing security and IAM requirements
Qualification & Experience:
- Experience with cross-cloud and inter-region data transfers/migrations is a plus
- 10+ years of Data Engineering experience working with distributed architectures, ETL, EDW, and Big Data technologies
- Experience with DW/BI modelling frameworks is highly regarded
- 3+ years of experience with Google Cloud services such as streaming and batch processing, Cloud Storage, Cloud Dataflow, Composer, DFunc, BigQuery, and Bigtable
- Strong Python and Java development experience
- Proven experience with Snowflake is a plus
- Experience with analytics/visualization tools such as Power BI, Looker, and Tableau
- Experience with dbt and integration with frontend tools
Job Details:
Company: L’Oréal
Vacancy Type: Full Time
Job Location: New York, NY, US
Application Deadline: N/A