Our client is a tech-centric hedge fund seeking an enthusiastic, highly focused Data Engineer to work closely with almost every team in the firm. The Technology team prides itself on using the best tools and modern technology to achieve its goals.
You will be responsible for maintaining a large number of data ingestion processes of varying complexity that feed investment decisions and critical operations. The role is fast-paced and demanding, but it will allow you to push boundaries while constantly learning and discovering new, creative ways to approach each task.
Responsibilities:
- Building supportable data ingestion pipelines, systems and platforms.
- Flagging issues and features and communicating them to others clearly and concisely; exploring and digging into data.
- Developing and standardising ingestion methodologies and promoting those standards to other teams across the business through shared tools and libraries.
- Maintaining highly effective systems for downstream teams, keeping them scalable, reliable and flexible over the long term as the company changes and expands.
- Improving, supporting and monitoring existing ingestion pipelines and systems.
Requirements:
- Extensive technical knowledge, particularly around exploring and processing data, and an interest in learning new technologies.
- A passion for continual improvement and automation, with a proven track record of identifying high-value automation opportunities.
- The ability to identify patterns and establish standards that shorten development time and increase reliability.
- The ability to collaborate with peers across all global teams and build positive relationships.
- A systematic and methodical approach to debugging and problem solving.
The ideal candidate will have:
- Coding experience in Go / Python / Java / Scala / C# or equivalent.
- Experience working with a range of data manipulation and storage tools, including SQL, Pandas, Elasticsearch & Kibana, and Snowflake.
- Experience with orchestration and containerisation technologies such as Flux / Docker / Helm / Kubernetes.
- Knowledge of multiple ETL/ELT technologies such as Hive / Airflow / Argo / Spark / Dagster.
Beneficial skills and experience:
- Experience with data visualisation tools.
- Experience with Apache Kafka or similar stream-processing platforms and concepts.
- Cross-asset financial markets experience, e.g. Equities, FX, Options, Futures, Fixed Income.