Job Description
This role focuses on building and maintaining robust data infrastructure and analytics frameworks to support business growth and expanding user data. The candidate will audit existing systems, design scalable solutions, and ensure seamless data integration across platforms. Key responsibilities include working closely with cross-functional teams to align technical implementations with business objectives while continuously improving data quality, workflow reliability, and system scalability. The position requires a deep understanding of data architecture principles and the ability to translate business needs into technical strategies that drive operational efficiency and innovation.
Key Responsibilities
- Conduct a comprehensive audit of the existing data infrastructure and availability within Cloud Firestore to identify inefficiencies and opportunities for optimization.
- Specify and implement a new data warehouse solution, such as Snowflake, to replace or enhance the current infrastructure, ensuring scalability, performance, and cost-effectiveness.
- Collaborate with the Product team to develop a scalable data infrastructure and analytics framework that can handle increasing user data volumes and evolving business needs.
- Partner with engineering teams to design and deploy data pipelines that enable efficient data ingestion, transformation, and querying across multiple sources.
- Continuously monitor and improve data quality, workflow reliability, and system scalability by exploring emerging technologies and refining existing processes.
- Provide performance reports and insights to stakeholders, demonstrating the capabilities and impact of data infrastructure improvements.
- Own the data lifecycle for new products and features, including defining data sources, designing warehouse architectures, and establishing reporting mechanisms.
- Ensure data systems are aligned with organizational goals by balancing technical requirements with business priorities.
- Lead the evaluation of data governance frameworks and ensure compliance with regulatory standards and internal policies.
- Develop and maintain documentation for data systems, including architecture diagrams, process flows, and technical specifications.
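To make the pipeline responsibilities above concrete, the kind of transformation step involved in moving document data (e.g., from Cloud Firestore) into tabular warehouse rows (e.g., for Snowflake) can be sketched as follows. This is a minimal, illustrative example only: the function and field names are hypothetical and do not reflect any real schema at the company.

```python
# Illustrative sketch: flatten nested, document-style records (as Cloud
# Firestore stores them) into single-level rows suitable for loading into
# a columnar warehouse such as Snowflake. Names here are hypothetical.

def flatten_document(doc: dict, parent_key: str = "", sep: str = "_") -> dict:
    """Recursively flatten a nested document into a single-level row."""
    row = {}
    for key, value in doc.items():
        full_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            # Recurse into nested maps, prefixing child keys with the parent key.
            row.update(flatten_document(value, full_key, sep))
        else:
            row[full_key] = value
    return row

doc = {"user": {"id": 42, "name": "Ada"}, "active": True}
print(flatten_document(doc))
# → {'user_id': 42, 'user_name': 'Ada', 'active': True}
```

In a real pipeline, rows produced this way would typically be staged (e.g., as CSV or Parquet files) and bulk-loaded into the warehouse rather than inserted one at a time.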
Job Requirements
- Proven experience in designing and managing cloud-based data infrastructure, with a strong background in Cloud Firestore and Snowflake.
- Advanced knowledge of data warehousing concepts, ETL processes, and data pipeline architecture to support scalable analytics solutions.
- Ability to collaborate effectively with Product and Engineering teams to translate business requirements into technical specifications.
- Strong analytical skills to evaluate data systems, identify bottlenecks, and propose data-driven improvements.
- Experience with data quality assurance, workflow automation, and performance optimization techniques.
- Excellent communication skills to convey technical insights to non-technical stakeholders and document processes clearly.
- Proficiency in SQL, cloud computing platforms, and data visualization tools to support data analysis and reporting.
- Ability to manage multiple projects simultaneously while maintaining attention to detail and meeting deadlines.
- Experience with agile methodologies and a track record of delivering scalable data solutions in fast-paced environments.
- Knowledge of data governance principles and security best practices to ensure compliance and data integrity.
- Strong problem-solving abilities and a proactive approach to identifying and resolving technical challenges in data systems.
- Experience with cloud-native technologies and containerization tools (e.g., Docker, Kubernetes) for deploying and managing data infrastructure.
- Ability to work independently and take ownership of complex data projects from conceptualization to deployment.
- Excellent time management skills and the capacity to prioritize tasks effectively in a dynamic work environment.
- Proficiency in scripting languages (e.g., Python, Bash) for automating data processing tasks and system maintenance.
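The scripting requirement above typically covers small automation tasks such as data-quality checks. The sketch below is a hypothetical example of that kind of script, not an actual tool used in the role: it flags rows missing required fields before they are loaded downstream.

```python
# Hypothetical data-quality check of the kind such scripts often automate:
# flag rows with missing required fields before they reach the warehouse.

def audit_rows(rows: list[dict], required_fields: list[str]) -> list[int]:
    """Return indices of rows missing (or holding None for) any required field."""
    bad = []
    for i, row in enumerate(rows):
        if any(row.get(field) is None for field in required_fields):
            bad.append(i)
    return bad

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},   # missing value
    {"id": 3},                  # missing field
]
print(audit_rows(rows, ["id", "email"]))
# → [1, 2]
```

A script like this would usually be wired into a scheduled workflow so that failing batches are quarantined and reported rather than loaded silently.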