Must have 6+ years of recent experience working for a major bank or brokerage house in the US.
Must have 12+ years of experience maintaining applications built with Java, J2EE, and WebSphere across the full SDLC.
Must have spent the last 6 years working with Cassandra, Hadoop, MongoDB, Apache Spark, HDFS, YARN, MapReduce, Pig & Hive, Flume & Sqoop, and ZooKeeper.
Must have 6 years of experience maintaining Tier-1 data-driven applications.
Must have experience supporting 24/7 uptime under strict SLAs.
Extensive experience maintaining data pipelines that aggregate and transform raw data from a variety of sources.
Extensive experience optimizing data delivery and redesigning pipelines to improve the performance, handling, and transformation of Big Data using Big Data frameworks.
Extensive experience processing data in parallel on top of distributed Hadoop storage using MapReduce.
Must have experience with SOA design principles.
Must have 5+ years of programming in Scala, Java, Python, or Go.
Must have 5+ years developing on Hadoop/Spark.
Must have 6+ years of development experience on an RDBMS such as Microsoft SQL Server or PostgreSQL.
Must have experience with large data sets, regularly transforming and querying tables of more than 20 million records.
Exposure to data hygiene routines and models.
Experience in database maintenance.
Ability to identify problems and effectively communicate solutions to the team.
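Several of the requirements above (MapReduce, Spark, parallel processing over distributed Hadoop storage) center on the map/reduce pattern. A minimal local sketch of that pattern, in plain Python with no Hadoop cluster (the function names are illustrative, not part of any framework):

```python
from collections import Counter
from functools import reduce

def map_phase(record):
    # Map step: emit (word, count) pairs for one raw input record.
    return Counter(record.split())

def reduce_phase(acc, partial):
    # Reduce step: merge per-record counts into a running total.
    acc.update(partial)
    return acc

def word_count(records):
    # On Hadoop or Spark the map and reduce phases run in parallel
    # across partitions; here they run sequentially for illustration.
    return reduce(reduce_phase, map(map_phase, records), Counter())

counts = word_count(["big data", "big pipelines"])
print(counts["big"])  # → 2
```

In a real cluster the same shape applies, but the framework shuffles intermediate pairs between map and reduce tasks and runs each phase across many nodes.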
The Job
The consultants will serve as Big Data Engineers, helping to support, maintain, and test the new database for specific business units. These positions will be responsible for maintaining complex databases for the Business Technology Group.
The consultants will work with minimal supervision, with guidance from more seasoned consultants, and may also be expected to provide application and database support.