Introhive is looking for an experienced Data Architect to join our data science team. This individual will be responsible for the data modelling that supports the deployment of machine learning models on Snowflake, working directly with internal teams to ensure that data modelling meets product development objectives. This person must be a self-starter who is comfortable working in a fast-paced, fun, energetic environment, at ease with ambiguity, and passionate about solving unique data management problems.

The Data Architect will provide technical expertise in traditional data management while working with modern data ecosystems to ensure data management is successful. This person will bring a broad range of skills and experience, from data architecture to ELT, security, performance analysis, and analytics, and will have the insight to connect data science-specific business problems with data architecture requirements and enhancements. They should have the technical skills not only to build data models and create DDL but also to provide consultative assistance on data architecture and implementation across the broader organization.

The person we're looking for shares our passion for reinventing the data platform and thrives in a dynamic environment. That means having the flexibility and willingness to jump in and get done what needs to be done to make our modern data architecture successful, and keeping up to date on rapidly evolving techniques for modern data management as well as machine learning feature store design.
What You’ll Do:
- Deploy data architecture best practices, including ensuring knowledge transfer so that data scientists can conduct ad-hoc data science experiments with minimal data movement.
- Provide guidance on how to develop and maintain Slowly Changing Dimension frameworks.
- Support other members of the data science team to develop their expertise in data management.
- Collaborate with Product and Engineering to continuously improve data management.
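As a concrete illustration of the Slowly Changing Dimension work mentioned above, the core of a Type 2 update can be sketched in Python. This is a minimal, in-memory sketch for illustration only; the function name, row shape, and sentinel date are assumptions, not Introhive's actual framework:

```python
from datetime import date

# Sentinel "open" end date marking the current version of a row (assumed convention).
HIGH_DATE = date(9999, 12, 31)

def scd2_upsert(dimension, key, attrs, effective):
    """Apply a Type 2 change: expire the current row for `key`
    and append a new current row carrying `attrs`.

    `dimension` is a list of dicts with keys:
    "key", "attrs", "start_date", "end_date".
    """
    for row in dimension:
        if row["key"] == key and row["end_date"] == HIGH_DATE:
            if row["attrs"] == attrs:
                return dimension  # no attribute change; nothing to version
            row["end_date"] = effective  # close out the old version
            break
    dimension.append({"key": key, "attrs": attrs,
                      "start_date": effective, "end_date": HIGH_DATE})
    return dimension
```

In a warehouse such as Snowflake the same logic would typically run as a `MERGE` statement or a dbt snapshot rather than row-by-row Python, but the versioning rule (close the open row, insert a new open row) is the same.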
What You’ve Accomplished:
- University degree in computer science, computer information systems, engineering, mathematics, or related fields.
- 10+ total years in data architecture, analytics, design, and development.
- Data Management experience including Data Governance, Master Data Management, Reference Data Management, Metadata Management, and Data Security.
- Understanding of a modern data analytics stack and workflow, from ELT to data platform design to analytical service layer design.
- Strong skills in databases, data warehouses, and data processing.
- Extensive hands-on expertise with SQL and SQL analytics in Snowflake.
- Extensive knowledge of and experience with data modelling within a columnar data store.
- Software development experience with Python, Ruby, and SQL.
- Ability to enhance and build on our data management strategy.
- Knowledge of Postgres (RDS) and Change Data Capture methods.
- Ability to perform data-related tasks and analysis in Snowflake.
Great if you’ve been exposed to or have:
- SnowPro and/or Snowflake master certification.
- Experience with modelling concepts that are appropriate for non-relational platforms.
- Experience implementing ELT pipelines using dbt.
- Understanding of modern approaches to automating data pipelines with Airflow.
- Experience using cloud services and tools such as EKS, Fargate, S3, Kinesis, Snowpipe, MWAA, DMS, Great Expectations, dbt, etc.
- Experience working within a product-focused development environment.
- Experience in deploying fully operational, production-ready data solutions on Snowflake.
- Good SQL and performance tuning experience in Snowflake.
- Experience with DVC, CML, and feature store ingest methods.
- Excitement at the opportunity to work on an innovative suite of software products.
- Interest in gaining unique, hands-on experience with bleeding-edge data management architectures.