My client is an American multinational food manufacturing company headquartered in the United States.
Overview:
- The full-stack data engineer is expected to play a pivotal role in operationalizing the most urgent data and analytics requirements for our client's AMEA digital business initiatives.
- The primary work would be managing and developing data pipelines, data models, and data visualizations for key data and analytics consumers (business/data analysts and data scientists).
- The ideal candidate has 8-10 years of experience in data engineering and visualization and enjoys building data systems and designing them for future scalability and growth.
- This role will be the key interface in operationalizing data and analytics in partnership with business units.
- This role will require working both creatively and collaboratively with IT and the wider business functions.
Responsibilities:
Governance:
- Participates in the Data & Analytics and Enterprise Applications governance process.
Consulting:
- Provides strategic consultation to business and IT teams.
- Participates in quality reviews and provides feedback.
- Works with business leaders to understand business requirements and help them understand how technology tradeoffs influence strategy.
Oversight:
- Assists in post-implementation continuous-improvement efforts to enhance performance and provide increased functionality.
- Ensures the conceptual completeness of the technical solution.
- Works closely with project management to ensure alignment of plans with what is being delivered.
Build data pipelines:
- Deliver data pipelines consisting of a series of data-flow stages: from data sources or acquisition endpoints, through integration, to consumption for specific use cases.
- Data visualization: create effective dashboards following the process of design, POC (wireframe), development, deployment, and continuous improvement.
Data Analysis & Governance:
- Analyze large datasets to identify trends, patterns, and insights.
- Design and develop interactive dashboards and reports using data visualization tools such as Tableau, Power BI, etc.
- Convert complex data into easily understandable formats for non-technical audiences.
- Combine data from various sources to provide a comprehensive view for better decision-making.
- Ensure the accuracy and integrity of data used for visualizations.
Collaborate across departments:
- The role works in close partnership with the business, BU IT, and business (data) analysts to refine their data requirements and data consumption requirements for various data and analytics initiatives.
- Work closely with data analysts, data scientists, and stakeholders to ensure visualizations meet functional and aesthetic requirements.
- Work is performed primarily from a home office, with occasional work from the office.
- Design, build, and maintain scalable and robust ETL pipelines.
- Develop data architecture and models to support real-time and batch data processing.
- Optimize database queries and implement solutions for high-performance requirements.
- Design, implement, and manage robust, scalable solutions on AWS, specifically leveraging Glue, S3, Redshift, Lambda, EventBridge, and Step Functions.
- Utilize AWS Lambda and EventBridge for event-driven architecture (a minimal sketch follows this list).
- Use Step Functions for orchestrating distributed applications and microservices.
- Design and implement complex SQL queries for analytics and data manipulation.
- Create normalized and denormalized models depending on use-case requirements.
- Optimize query performance through indexing, partitioning, and other database techniques.
- Develop scalable and maintainable Python code for data processing, API interactions, and business logic.
- Requires exposure to multiple, diverse technologies and processing environments.
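For illustration only, the sketch below shows one way the event-driven pattern named above could look in Python with boto3. All names (the Glue job, the Step Functions state machine ARN, the account and region) are hypothetical placeholders, not details of the client's environment: a Lambda handler, triggered by an EventBridge rule for new S3 objects, starts a Glue job run and hands off to a Step Functions execution for downstream orchestration.

    import json
    import boto3

    glue = boto3.client("glue")
    sfn = boto3.client("stepfunctions")

    # Hypothetical names; substitute the real Glue job and state machine for the pipeline.
    GLUE_JOB_NAME = "curate-sales-data"
    STATE_MACHINE_ARN = "arn:aws:states:ap-southeast-1:123456789012:stateMachine:sales-pipeline"

    def handler(event, context):
        """Triggered by an EventBridge rule on S3 'Object Created' events."""
        detail = event.get("detail", {})
        bucket = detail.get("bucket", {}).get("name")
        key = detail.get("object", {}).get("key")

        # Kick off a Glue job to transform the newly landed file.
        run = glue.start_job_run(
            JobName=GLUE_JOB_NAME,
            Arguments={"--source_bucket": bucket, "--source_key": key},
        )

        # Hand the run id to a Step Functions state machine that orchestrates the
        # downstream stages (load to Redshift, data-quality checks, BI refresh).
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"glue_job_run_id": run["JobRunId"], "bucket": bucket, "key": key}),
        )

        return {"status": "started", "glue_job_run_id": run["JobRunId"]}

Splitting the trigger (Lambda/EventBridge) from the orchestration (Step Functions) keeps the handler thin and lets the state machine own retries and sequencing, which is one common way to meet the scalability and maintainability goals listed above.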
Requirements:
- 8-10 years or more of work experience in data management disciplines, including solution architecture, data integration, modeling, optimization, and data quality, and/or other areas directly relevant to data engineering responsibilities and tasks.
- A bachelor's or master's degree in computer science, statistics, applied mathematics, data management, information systems, information science, or a related quantitative field, or equivalent work experience, is required.
- Some international travel is required.
Knowledge, Skills and Experience:
- Should have a good understanding of business processes.
- Strong experience with relational SQL, SAP BODS, and SAP ERP (ECC/HANA).
- Strong experience with AWS services, Power BI, Power Apps, and Tableau.
- Strong ability to provide direction to design, build, and manage data pipelines for data structures encompassing data transformation, data models, and schemas.
- UX/UI skills would be an added advantage.