Senior Manager, Data Engineering


Capital One

2018-10-08 16:46:36

Job location: McLean, Virginia, United States

Job type: full-time

Job industry: Engineering

Job description

McLean 1 (19050), United States of America, McLean, Virginia

At Capital One, we're building a leading information-based technology company. Still founder-led by Chairman and Chief Executive Officer Richard Fairbank, Capital One is on a mission to help our customers succeed by bringing ingenuity, simplicity, and humanity to banking. We measure our efforts by the success our customers enjoy and the advocacy they exhibit. We are succeeding because they are succeeding.

Guided by our shared values, we thrive in an environment where collaboration and openness are valued. We believe that innovation is powered by perspective and that teamwork and respect for each other lead to superior results. We elevate each other and obsess about doing the right thing. Our associates serve with humility and a deep respect for their responsibility in helping our customers achieve their goals and realize their dreams. Together, we are on a quest to change banking for good.

Senior Manager, Data Engineering

As a Capital One Senior Manager, Data Engineering, you'll be part of an Agile team dedicated to breaking the norm and pushing the limits of continuous improvement and innovation. You will participate in detailed technical design, development and implementation of applications using existing and emerging technology platforms. Working within an Agile environment, you will provide input into architectural design decisions, develop code to meet story acceptance criteria, and ensure that the applications we build are always available to our customers. You'll have the opportunity to mentor other engineers and develop your technical knowledge and skills to keep your mind and our business on the cutting edge of technology. At Capital One, we have seas of big data and rivers of fast data.

Who You Are:

  • You yearn to be part of cutting-edge, high-profile projects and are motivated by delivering world-class solutions on an aggressive schedule
  • You are not intimidated by challenges; you thrive under pressure, are passionate about your craft, and are hyper-focused on delivering exceptional results
  • You love to learn new technologies and mentor junior engineers to raise the bar on your team
  • It would be awesome if you have a robust portfolio on GitHub and/or open-source contributions you are proud to share
  • You are passionate about intuitive and engaging user interfaces, as well as new and emerging concepts and techniques

The Job:

  • Collaborating as part of a cross-functional Agile team to create and enhance software that enables state-of-the-art, next-generation Big Data and Fast Data applications
  • Building efficient storage for structured and unstructured data
  • Developing and deploying distributed Big Data applications using open-source frameworks like Apache Spark, Apex, Flink, NiFi, Storm, and Kafka on the AWS Cloud
  • Utilizing programming languages like Java, Scala, and Python; open-source RDBMS and NoSQL databases; and cloud-based data warehousing services such as Redshift
  • Utilizing Hadoop modules such as YARN and MapReduce, and related Apache projects such as Hive, HBase, Pig, and Cassandra
  • Leveraging DevOps techniques and practices like Continuous Integration, Continuous Deployment, Test Automation, Build Automation, and Test-Driven Development to enable the rapid delivery of working code, utilizing tools like Jenkins, Maven, Nexus, Chef, Terraform, Ruby, Git, and Docker
  • Performing unit tests and conducting reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance

Basic Qualifications:

  • Bachelor's Degree or military experience
  • At least 5 years of professional work experience in data warehousing or analytics
  • At least 4 years of experience in open source programming languages for data analysis
  • At least 3 years of Java development for data engineering
  • At least 3 years of data modeling development
  • At least 1 year of experience working with cloud data capabilities

Preferred Qualifications:

  • Master's Degree or PhD
  • 5+ years Java development experience
  • 4+ years of experience in Python, Scala, or R for large scale data analysis
  • 4+ years of experience with relational database systems and SQL (PostgreSQL or Redshift)
  • 4+ years of UNIX/Linux experience
  • 2+ years of Agile engineering experience
  • 2+ years of experience with the Hadoop Stack
  • 2+ years of experience with Cloud computing (AWS)
  • 1+ years of experience with supervised machine learning
  • 1+ years of experience applying the CAP theorem and addressing complex non-functional requirements
  • 1+ years of experience in developing large and complex event processing platforms
  • 1+ years of experience in developing high volume transaction processing solutions that can be scaled for millions of daily transactions

What We Have:

  • Flexible work schedules
  • Convenient office locations
  • Generous salary and merit-based pay incentives
  • A startup mindset with the wallet of a top 10 bank
  • Monthly innovation challenges dedicated to test-driving cutting-edge technologies
  • Your choice of equipment (MacBook/PC, iPhone/Android Device)

Capital One will consider sponsoring a new qualified applicant for employment authorization for this position.
