Data Engineer

Job description

Company

WEBB Traders is a fast-growing global proprietary trading company and market maker, active in electronic trading on leading international financial markets and driven by data and technology. We are entrepreneurial and innovative, embracing teamwork and creativity. Our headquarters is in Amsterdam City Centre, supported by a broader European and Asian presence from which the regional and US markets are traded.

 

WEBB is searching for a data engineer in Amsterdam.

 

Role

WEBB Traders is looking for a data engineer to join our growing R&D team in Amsterdam. This young and dynamic team is responsible for building our automated trading solutions, tools, data processing, and connectivity. You will be challenged and exposed to financial markets data, trading intelligence, exchange/market connectivity, reporting and risk requirements, and quantitative and big data analysis.

 

We focus on continuous improvement and technological innovation. The data tools we use include MSSQL, MongoDB, GCP, QlikSense, and others; the programming languages we use are C++, C#, and Python. As a data engineer you will be a member of the development team, responsible for the retrieval, capture, storage, cleaning, normalization, and distribution of all data needed by our stakeholders. The countless trading decisions made daily are mainly data-driven, so you will work closely with traders to deliver fast, accurate, and complete data solutions.


Do you want to work in an entrepreneurial organization and become an expert in managing trading and financial markets data? Then we are looking for you!


You will build and improve high-performance data solutions and workflows for trading, risk management, operations, and the quants team. To keep up with fast-changing market developments, you will spend time researching new technologies and inventing new features. You will create data-driven insights for data capture, analysis, automation, and distribution.


Your work will consist mainly of:

  • Develop and optimize data retrieval and cleaning processes
  • Control, validate, monitor, maintain, and distribute data to internal stakeholders
  • Develop, extend, and maintain data tools built in Python, PySpark, and SQL
  • Master and optimize the company's databases
  • Drive data-driven automation, data capture, and analysis
  • Help adopt best practices and create awareness
  • Control spend on market data and data processing costs

Our offer

  • The opportunity to join a highly professional team in a very challenging market
  • Be part of a fast growing and ambitious company
  • A competitive reward package
  • Facilities to develop your skills and competencies
  • Healthy lunches at the office
  • Free sports facilities
  • Travel cost reimbursement
  • Regular fun activities with colleagues

Requirements

Skills required:

  • A Bachelor's or Master's degree in computer science, data science, computer engineering, or a related field of study
  • 1 to 2 years' experience in a data engineering or data management role, working with Google Cloud Platform and building PySpark- and Python-based tools
  • A proven track record with cleaning, transformation, and maintenance methods and procedures for effective data management
  • Solid data science fundamentals, preferably with experience in machine learning and artificial intelligence
  • Understanding of networking and databases
  • Familiarity with version control (git), unit testing, code coverage & code reviewing
  • Excellent communication skills to work closely with traders, quants and IT engineers
  • Ability to adapt and work in a fast-paced environment

Preferably complemented with:

  • A true interest in finance and trading
  • The personality to challenge and drive the company towards the next level of data maturity
  • Experience with financial market data, historical data and tick data
  • Experience with Natural Language Processing / textual data
  • Understanding of C/C++
  • Understanding of Agile/Scrum methodologies
  • Experience with DevOps tools and continuous integration and continuous deployment
  • Experience with Atlassian suite (Confluence, JIRA, JIRA Service Desk, BitBucket)