Junior Data Engineer
Who You Are:
You are a curious, driven, and detail-oriented individual who is passionate about working with data and solving operational challenges. You thrive in fast-paced environments, enjoy troubleshooting, and have a keen interest in building and optimizing data pipelines. You are looking for a hands-on role where you can learn and grow as a Data Engineer while contributing directly to the success of a growing SaaS company. You're comfortable managing daily operations and have a solid foundation in data engineering tools and concepts such as SQL, Python, data visualization, ETL, and cloud solutions.
The Opportunity:
As a Junior Data Engineer, you will play a key role in supporting daily operations across various platforms, including Alteryx, Tableau, Postgres, and AWS. You will monitor data processes, troubleshoot issues, and help improve and automate operational workflows. This role provides the opportunity to work closely with the founding team and engineers, giving you direct exposure to the inner workings of a data-driven startup. You’ll gain valuable experience in managing large-scale data infrastructure and building scalable ETL pipelines, all while helping drive product development at YDP.
Who We Are:
At Your Data Playbook (YDP), we believe data is the most valuable asset of the 21st century. Our mission is to unlock this potential and deliver meaningful growth for eCommerce entrepreneurs worldwide.
We empower entrepreneurs by transforming their data into actionable intelligence, enabling them to identify and prioritize the most impactful opportunities for growth.
Join us in creating a future where harnessing data fuels success for every entrepreneur.
What You'll Do:
- Monitor and manage daily operations: Keep a close eye on our operational processes, ensuring all workflows run smoothly.
- Troubleshoot and resolve issues: Handle common errors like workflow failures, server memory or disk space issues, connection interruptions, and manual recovery processes.
- Support data infrastructure: Work alongside the founding team and engineers to design, develop, and manage scalable data pipelines and ETL processes.
- Automate operational workflows: Help enhance and automate key processes such as data validation checks, member data consistency checks, and dataset completeness verification in both local and cloud environments.
- Optimize server performance: Monitor AWS disk and RAM utilization, improve server status alerts, and troubleshoot performance bottlenecks.
- Licensing management: Oversee Alteryx and Tableau licensing to ensure compliance and timely renewals.
- Collaborate with product teams: Contribute to the development of YDP’s data optimization platform by working closely with engineers and data scientists to build, maintain, and enhance data pipelines, API integrations, and web scraping systems.
- Contribute to product growth: As part of a small, high-performing team, you’ll help influence product development by providing valuable insights into data processes, infrastructure optimization, and operational improvements.
Must-Haves in a Candidate:
- 1+ years of hands-on experience with data engineering tools such as SQL, Python, Alteryx, PostgreSQL, AWS, Tableau, or related technologies.
- 1+ years of hands-on experience building and optimizing scalable ETL pipelines using tools like Alteryx, Informatica, dbt, or equivalent.
- Experience with Python for scripting and automation.
- Experience with relational databases like PostgreSQL, including troubleshooting and optimizing database performance.
- Experience troubleshooting operational errors: Familiarity with common issues such as server outages, disk space management, and workflow failures.
- Strong communication skills: Ability to effectively manage expectations and communicate operational updates with internal teams.
- Attention to detail: You enjoy creating and maintaining accurate data pipelines and performing data validation checks.
- Proactive problem-solving: Ability to handle operational disruptions and recovery processes, with a mindset for continuous improvement.
Nice to Have in a Candidate:
- Familiarity with cloud platforms: AWS, Google Cloud, or equivalent.
- Data visualization skills: Experience creating visualizations with Tableau or similar tools.
- Interest in web scraping and API integration: Building and maintaining systems that collect data.
- Basic knowledge of HTML and CSS.
- Experience with Docker, Terraform, or similar technologies.
Details:
- Job Type: Full-time.
- Location: US, Panama, Remote.
- Salary: Competitive and dependent on experience.
- Benefits: Opportunity for growth and the ability to shape the future of the company.