Data Engineer
- Biguaçu - SC
- Permanent
- Full-time
- Our mission is to make researching and enrolling in schools easy, transparent, and free.
- With in-depth profiles on every school and college in America, 140 million reviews and ratings, and powerful search tools, we help millions of people find the right school for them.
- We also help thousands of schools recruit more best-fit students, by highlighting what makes them great and making it easier to visit and apply.
- Niche is all about finding where you belong, and that mission inspires how we operate every day.
- We want Niche to be a place where people truly enjoy working and can thrive professionally.
About the Role
- Niche is looking for a skilled Data Engineer to join the Data Engineering team.
- You'll build and support data pipelines that can handle the volume and complexity of our data while ensuring scale, data accuracy, availability, observability, security, and optimal performance.
- You'll develop and maintain data warehouse tables, views, and models for consumption by analysts and downstream applications.
- This is an exciting opportunity to join our team as we're building the next generation of our data platform and engineering capabilities.
- You'll report to the Manager, Data Engineering (Core).
What You Will Do
- Design, build, and maintain scalable, secure data pipelines that ensure data accuracy, availability, and performance.
- Develop and support data models, warehouse tables, and views for analysts and downstream applications.
- Ensure observability and quality through monitoring, lineage tracking, and alerting systems.
- Implement and maintain core data infrastructure and tooling (e.g., dbt Cloud, Airflow, RudderStack, cloud storage).
- Collaborate cross-functionally with analysts, engineers, and product teams to enable efficient data use.
- Integrate governance and security controls such as access management and cost visibility.
- Contribute to platform evolution and developer enablement through reusable frameworks, automation, and documentation.
What We Are Looking For
- Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field.
- 3-5 years of experience in data engineering.
- Demonstrated experience building and supporting large-scale data pipelines for both streaming and batch processing.
- Software engineering mindset, leading with the principles of source control, infrastructure as code, testing, modularity, automation, CI/CD, and observability.
- Proficiency in Python, SQL, Snowflake, Postgres, dbt, and Airflow.
- Experience working with Google Analytics, marketing, ad, and social media platforms, CRM/Salesforce, and JSON data; government datasets and geospatial data are a plus.
- Knowledge and understanding of the modern data platform and its key components: ingestion, transformation, curation, quality, governance, and delivery.
- Knowledge of data modeling techniques (3NF, Dimensional, Vault).
- Experience with Docker, Kubernetes, and Kafka is a big plus.
- Self-starter, analytical problem solver, highly attentive to detail, effective communicator, and obsessed with good documentation.
First Year Plan
- During the 1st Month: Immerse yourself in the company culture and get to know your team and key stakeholders.
- Build relationships with data engineering team members, and understand the day-to-day operating model and the stakeholders we interact with daily.
- Start to learn about our data platform infrastructure, data pipelines, source systems, and inter-dependencies.
- Start participating in standups, planning, and retrospective meetings.
- Start delivering on assigned sprint stories and show progress through completed tasks that contribute to team goals.
- Within 3 Months: Start delivering on assigned data engineering tasks that support our day-to-day work and roadmap.
- Start troubleshooting production issues and participating in on-call activities.
- Identify areas for improving data engineering processes and share them with the team.
- Within 6 Months: Contribute consistently to building our data platform, including data pipelines and data warehouse layers.
- Start to independently own workstreams, whether periodic data engineering activities or work items in support of our roadmap.
- Deepen your understanding of, and build subject matter expertise in, our data and its ecosystem.
- Within 12 Months: Your contributions have driven significant progress on the data platform strategy and key data initiatives that support the company's growth.
- You've established yourself as a key team member with subject matter expertise within data engineering.