Data Engineer

As our industry has evolved, the way our clients consume data has changed. RBC Capital Markets' application teams face challenges of large-scale data storage, low-latency retrieval, high-volume requests, and high availability across a distributed environment. We create standardized solutions to these problems by building core services and technology frameworks for enterprise-wide use.

Our data and analytics platform serves the needs of various lines of business spanning trading, risk, research, sales, regulation, and compliance. The mission of the data fabric team is to make the complex simple and to provide a scalable, highly available data processing platform.

As an experienced L3 Engineer, you will help us achieve operational excellence in our big data query infrastructure and accelerate and streamline the adoption of large-scale programs aligned with our data consumption strategy.

What will you do?

  • Analyze medium to high complexity queries for performance optimization.
  • Update queries, then run and analyze query plans to incrementally improve join performance, aggregation performance, and the inclusion of relevant predicates.
  • Identify resource requirements for query execution and work with end-users and platform support teams to provide adequate capacity.
  • Review query execution reports on a daily, monthly, and quarterly basis to assess capacity needs and optimization opportunities.
  • Provide L3 support to end-users.
  • Provide timely feedback and status updates to Project Management.
  • Share knowledge, conduct education workshops, and train other employees.
  • Keep pace with emerging technology by researching and evaluating products.
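As a minimal, hypothetical sketch of the plan-driven tuning loop described above, the workflow can be illustrated with Python's built-in sqlite3 module (the table and index names are invented for illustration; the same inspect-index-reinspect cycle applies to EXPLAIN output in Spark, Dremio, or Presto):

```python
import sqlite3

# Hypothetical illustration: two small tables joined on a key,
# with a filtering predicate on the dimension table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, desk TEXT, notional REAL)")
cur.execute("CREATE TABLE desks (desk TEXT, region TEXT)")

query = """
    SELECT t.id, d.region
    FROM trades t JOIN desks d ON t.desk = d.desk
    WHERE d.region = 'EMEA'
"""

# Step 1: capture the query plan as-is.
plan_before = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()

# Step 2: add an index covering both the predicate column and the join key,
# then re-capture the plan to confirm the optimizer picks it up.
cur.execute("CREATE INDEX idx_desks_region_desk ON desks (region, desk)")
plan_after = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()

for row in plan_before:
    print("before:", row)
for row in plan_after:
    print("after: ", row)
```

Comparing the "before" and "after" plan rows shows whether the new index is actually used for the join and predicate, which is the evidence an engineer would use to accept or revert the change.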

What do you need to succeed?

  • BS in Computer Science, Math, or Engineering, and 5+ years of experience working with data systems that handle large data volumes for analytical workloads.
  • 1+ years of experience with Spark, Dremio, or Presto.
  • Expertise with Hadoop and exposure to at least one of Oracle, SQL Server, Postgres, AWS S3, or Azure ADLS.
  • Exposure to Kubernetes, Docker, and AWS/Azure.
  • Prior experience as a DBA is preferred.
  • Ability to write shell scripts and work in a Linux environment; proficient understanding of distributed computing principles.

What’s in it for you?

We thrive on the challenge to be our best, progressive thinking to keep growing and working together to deliver trusted advice to help our clients thrive and communities prosper. We care about each other, reaching our potential, making a difference to our communities, and achieving success that is mutual.

  • Leaders who support your development through coaching and managing opportunities
  • Ability to make a difference and lasting impact
  • Work in a dynamic, collaborative, progressive, and high-performing team, with access to a world-class training program in financial services
  • Flexible work/life balance options
  • Opportunities to do challenging work
  • Opportunities to take on progressively greater accountabilities
  • Opportunities to build close relationships with key stakeholders
  • Access to a variety of job opportunities across businesses and geographies via internal mobility