Sr. Data Engineer


Dornan Technologies

Posted: 2020-11-21

Job location: Santa Clara, California, United States

Job type: Full-time

Job industry: I.T. & Communications

Job description

We are looking for a Sr. Data Engineer for a long-term contract position with our direct end client in Santa Clara, CA.


U.S. citizens and those authorized to work in the U.S. are encouraged to apply. We are able to provide sponsorship at this time.


Job Title: Sr. Data Engineer


Job Type: Long-term contract


Location: Santa Clara, CA


Mode of Interview: Telephone/Web Call.


Main Duties/Responsibilities:


  1. Create and maintain an optimal data and model DataOps pipeline architecture.
  2. Assemble large, complex data sets that meet functional / non-functional business requirements.
  3. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  4. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and cloud-based big data technologies from AWS, Azure and others.
  5. Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  6. Keep data separated and secure across national boundaries, spanning multiple data centers and strategic customers/partners.
  7. Create tool chains for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader.
  8. Work with data and machine learning experts to strive for greater functionality in our data and model life cycle management systems.
  9. Support DataOps competence build-up in Ericsson Businesses and Customer Serving Units.

Skills & Experience:


BS, MS, or PhD degree in Computer Science, Informatics, Information Systems, or another related field.


  1. 3-5 years of experience with big data tools: Hadoop, Spark, Kafka, etc.
  2. Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
  3. Experience with data and model pipeline and workflow management tools: Azkaban, Luigi, Airflow, Dataiku, etc.
  4. Experience with stream-processing systems: Storm, Spark Streaming, etc.
  5. Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.

Additional Information:


If you are interested, please respond to this posting with an updated profile and a summary of your technical and personal skills. We look forward to hearing from you soon.


All your information will be kept confidential according to EEO guidelines.

