Technopals Pte Ltd

BIG DATA ENGINEER

Job Scope
  • Design and implement relevant data models in the form of data marts stored in Operational Data Stores, Data Warehouses, or Big Data platforms
  • Build data pipelines that ingest data from source systems, then harmonise and cleanse it to support analytics initiatives for core business metrics and performance trends
  • Perform data profiling to understand data quality and recommend practical measures to address data quality issues through data transformation and data loading
  • Dive into company data to identify sources and features that will drive business objectives.
  • Work closely with project managers and technical leads to provide regular status reports, help refine issues and problem statements, and propose and evaluate relevant analytics solutions
  • Bring your experience and ideas to effective and innovative engineering, design, and strategy
  • Work in interdisciplinary teams that combine technical, business, and data science competencies, delivering work through waterfall or agile software development lifecycle methodologies
  • The range of accountability, responsibility, and autonomy will depend on your experience and seniority, including:
    • Contributing to our internal networks and special interest groups
    • Mentoring to upskill peers and juniors
Job Requirements
  • Diploma or Degree in Computer Science, Computer Engineering, Information Technology, or a related IT field
  • Minimum of 3 years’ experience building large-scale enterprise data pipelines using commercial and/or open-source data management tools from vendors such as Informatica, Talend, Microsoft, IBM, or Oracle
  • Strong knowledge of data manipulation languages such as SQL, sufficient to build and maintain complex queries and data pipelines
  • Practical appreciation of data quality metrics and remediation strategies
  • Data modelling and architecture skills, including a strong foundation in data warehousing concepts, data normalisation, and dimensional data modelling (e.g. for OLAP)
  • Undergraduate or graduate degree in Computer Science or equivalent
  • Possess good communication skills to understand our customers' core business objectives and build end-to-end, data-centric solutions to address them
  • Good critical thinking and problem-solving abilities
Good To Have
  • Experience with other aspects of data management, such as data governance, metadata management, archival, and data lifecycle management
  • Experience processing semi-structured and unstructured data sets using NoSQL, graph, and Hadoop-based data storage technologies such as MongoDB, Cassandra, HBase, Hortonworks/Cloudera, Elasticsearch, and Neo4j, with Spark, Splunk, or Apache NiFi for batch or streaming data
  • Large-scale data loading experience moving enterprise or operational data from source systems to new applications or data analytics solutions
  • Experience leveraging cloud-based data analytics platforms such as:
    • AWS serverless architecture using Lambda, DynamoDB, EMR, and Redshift
    • Azure Data Factory or SQL Data Warehouse
    • GCP BigQuery/BigTable and Cloud Dataprep/Dataflow/Dataproc
