
Senior Data Engineer
- São Paulo - SP
- Temporary
- Full-time
- Large payment processing company.
- Data Modeling: Design and implement data models that support efficient data storage, retrieval, and analysis;
- Data Quality Assurance: Implement best practices for data quality, integrity, and security to ensure reliable analytics and reporting;
- Data Warehousing/Data Lake: Design and manage the data warehouse or data lake, ensuring data is organized and optimized for reporting and analysis;
- ETL Processes: Develop and maintain efficient ETL processes to transform raw data into usable formats for reporting and analysis;
- Performance Optimization: Continuously monitor and optimize data pipelines and queries to ensure optimal performance and resource utilization;
- Collaboration: Work closely with product managers, software engineers, and other stakeholders to understand data requirements and translate them into effective data solutions;
- Innovation: Stay up-to-date with industry trends and emerging technologies in data engineering and analytics, and proactively suggest innovative solutions to enhance developer productivity;
- Technical Expertise: Deep understanding of data engineering principles, data modeling, ETL processes, and data warehousing/data lakes. Proficiency in SQL and relevant programming languages (e.g., Python, Java);
- Data Pipeline Tools: Experience with data pipeline tools such as Apache Airflow, Apache Kafka, or similar technologies;
- Cloud Platforms: Familiarity with cloud platforms such as AWS, Azure, or GCP, and their data services;
- Problem-Solving Skills: Strong analytical and problem-solving skills, with the ability to troubleshoot data issues and optimize data pipelines;
- Attention to Detail: Meticulous approach to data quality and accuracy, ensuring data integrity throughout the pipeline;
- Communication Skills: Ability to communicate technical concepts clearly and collaborate effectively with cross-functional teams.
- Bachelor's degree in computer science, engineering, or a related field;
- Senior-level experience in data engineering, with a focus on data pipeline development and data architecture;
- Familiarity with data visualization tools (e.g., Power BI) and analytics frameworks;
- Strong proficiency in SQL and experience with big data technologies (e.g., Hadoop, Spark, Kafka);
- Proficiency with at least one programming language (e.g., Java);
- Experience with cloud platforms (e.g., AWS, GCP, Azure) and data storage solutions (e.g., Redshift, BigQuery, Snowflake);
- Customer-centric mindset with a track record of synthesizing customer insights and constructing product roadmaps, plus a demonstrated ability to empathize with developers and influence developers and stakeholders at all levels of the organization;
- Excellent problem-solving skills and the ability to work collaboratively in a fast-paced environment;
- Availability to work on a temporary basis for 6 to 9 months and good fluency in English are required.