Big Data Lead/Big Data Architect/Big Data Engineer

Information Technology
Bethesda (On-Site)
Mid-Senior


Job Description

ROLE: Big Data Lead (1st)


Location: Bethesda, MD (remote with travel possible)


The Big Data Lead will be responsible for building and maintaining data pipelines and data products to ingest and process large volumes of structured and unstructured data from various sources. The Big Data Lead will analyze data needs, migrate data into an enterprise data lake, and build data products and reports. The role requires experience building real-time and batch ETL pipelines, with a strong understanding of big data technologies and distributed processing frameworks.


Experience (Total, Relevant)

8 to 10 years


Tech Skills:

Hive, Snowflake, NiFi, Jenkins, Airflow, AWS, Unix, Kafka, EMR, Spark & Scala, Big Data with Snowflake




ROLE: Big Data Architect (2nd)


Location: Bethesda, MD (on-site)

Architect with proven experience in Big Data, Cloud, and relational data store migration.


Experience (Total, Relevant)

12+ years


Tech Skills:

Big Data architecture, Hive, Snowflake, NiFi, Jenkins, Airflow, AWS, Unix, Kafka, EMR, Spark & Scala, Big Data with Snowflake


ROLE: Big Data Engineer (3rd)

Location: New York or Remote


Experience Level: 5 to 7 years


Job Description:

Strong software development skills and ability to multitask in an Agile environment

5+ years’ experience programming in Java (a must, as our framework is in Java) or Python.

3+ years’ experience with Kafka Connect or equivalent.

Excellent communication skills and the ability to collaborate across teams, manage competing goals and changing priorities in a fast-paced environment

3+ years’ experience working with Big Data technologies such as HDFS, Hadoop, Spark, Hive, HBase, Druid, etc.

3+ years’ experience working with Apache Kafka or other message queuing systems.

Experience with data pipelines at scale.

Experience working with/in containerized environments using Docker, Kubernetes, Swarm, Rancher, Mesos.

Experience writing parsers/schemas for semi-structured and unstructured content.

Experience with Elasticsearch, Solr, Cassandra, and other database technologies.


Skills

Big Data