Role: Data Engineer
100% Remote
Advanced English
100% Payroll
Minimum 6-8 years of Big Data development experience.
*Migration to the Data Lake of the tables used by the Financial Services (SF) and Fraud Prevention (PF) areas*
Demonstrates up-to-date expertise in Data Engineering and complex data pipeline development.
Experience working in agile models.
Design, develop, implement, and tune large-scale distributed systems and pipelines that process large volumes of data, focusing on scalability, low latency, and fault tolerance in every system built.
Experience with Java and Python for writing data pipelines and data processing layers.
Experience with Airflow and GitHub.
Conversational English.
Experience writing MapReduce jobs.
Demonstrates expertise in writing complex, highly optimized queries across large data sets.
Proven, working expertise with Big Data technologies: Hadoop, Hive, Kafka, Presto, Spark, HBase.
Highly proficient in SQL.
Experience with Cloud Technologies (GCP, Azure).
Experience with relational and in-memory data stores is desirable (Oracle, Cassandra, Druid).
Provides and supports the implementation and operation of data pipelines and analytical solutions.
Experience performance-tuning systems that work with large data sets.
Experience with REST API data services for data consumption.
Retail experience is a huge plus.