Full-Time | Day Shift | Makati (Hybrid)
About the Role
As a key member of our data team, you will design, implement, and manage web scraping operations and data pipelines within Snowflake environments. Your expertise in Python will be essential for developing scalable and efficient data processes that support our analytics and business intelligence goals.
You will play a crucial role in extracting and processing data from various online sources to drive our strategic decisions.
Why Cooee
Because we believe in the power of human connection. Because we are committed to flourishing human potential. Because we dream of a world where each one of us walks along the path to who we are and the best that we can be. This is why we do what we do: to be a part of transformation, one person, one community, one business at a time.
We are One Team committed to investing in relationships fueled by trust and anchored in One Shared Vision: to transform through connection. We believe this is where the strength of Cooee and our partnerships lies, in having clarity and conviction in purpose.
What you'll be working on
- Develop and maintain advanced web scrapers using Python to collect data from specified online sources.
- Design and manage data workflows and pipelines within our Snowflake data warehouse to ensure seamless data integration and accessibility.
- Uphold data quality and security, implementing robust measures to protect data integrity as it moves through extraction, transformation, and loading processes.
- Work closely with data analysts and business units to determine evolving data needs and refine data collection strategies to support business objectives.
- Monitor, optimize, and troubleshoot scraping scripts and Snowflake data pipelines to address any operational challenges.
- Stay at the forefront of developments in web scraping technologies, Python programming, and data warehousing practices.
- Document all data engineering processes, creating detailed guides for maintenance and compliance purposes.
What we're looking for
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven expertise in web scraping with Python, using libraries such as Scrapy, Beautiful Soup, or similar.
- Strong experience in managing and optimizing data pipelines in Snowflake.
- Proficient in SQL and familiar with other database technologies.
- Solid understanding of data structures, algorithms, and system design.
- Knowledgeable in data cleaning, transformation, and processing methodologies.
- Experience with Apache Airflow.
- Experience with cloud platforms, preferably in a Python development environment.
- Exceptional problem-solving skills and the capability to perform independently in complex project settings.
- Effective communication and collaboration skills.
Preferred Qualifications:
- Practical experience with machine learning frameworks and Python data libraries (e.g., Pandas, NumPy).
- Experience with TensorFlow.
Job Type: Full-time, AM Shift (7:00 AM to 4:00 PM)
Work Setup:
- Hybrid (Makati Office): 4th Floor, Glass Tower, 115 Carlos Palanca St., Legazpi Village, Makati City