
Data Engineer with Big data

GlobalLogic


Job Details

Location: Austin, Travis County, Texas, USA
Posted: Jun 04, 2020

Job Description

* 3-5 years of experience building large-scale distributed systems
* Experience designing and developing ETL data pipelines; proficiency in writing advanced SQL and expertise in SQL performance tuning
* Experience designing and building dimensional data models to improve the accessibility, efficiency, and quality of data
* Database development experience with Relational or MPP/distributed systems such as Oracle/Teradata/Vertica/Hadoop
* Knowledge of at least two of the following: Spark, MapReduce, HDFS, Cassandra, Kafka
* Programming experience building high-quality software; Java, Python, or Scala preferred
* Excellent understanding of development processes and agile methodologies
* Strong analytical and interpersonal skills
* Enthusiastic, highly motivated, and able to learn quickly
* Experience with or advanced coursework in data science and machine learning is ideal
* Work/project experience with Big Data and advanced programming languages is a plus
* Experience developing Big Data/Hadoop applications using Java, Spark, Hive, Oozie, Kafka, and MapReduce is a huge plus
* Exceptional analytical and programming skills

About GlobalLogic

GlobalLogic is a full-lifecycle product development services leader.

