Nethermind

What are we all about?

We are a team of world-class builders and researchers with expertise across several domains: Ethereum Protocol Engineering, Layer-2, Decentralized Finance (DeFi), Miner Extractable Value (MEV), Smart Contract Development, Security Auditing, and Formal Verification.

Working to solve some of the most challenging problems in the blockchain space, we frequently collaborate with renowned organizations such as the Ethereum Foundation, StarkWare, Gnosis Chain, Aave, Flashbots, xDai, Open Zeppelin, Forta Protocol, Energy Web, POA Network, and many more.

We actively contribute to Ethereum core development, EIPs, and network upgrades together with the Ethereum Foundation and other client teams.

Today, there are nearly 200 of us working remotely from more than 45 countries.

As a Blockchain Data Engineer, you’ll be responsible for building and maintaining the data infrastructure for the DeFi projects Nethermind is involved in. You’ll write and maintain ETL pipelines and their orchestration to build meaningful products and APIs.

The role:

  • Build bespoke data infrastructure for various projects.
  • Crawl and ingest data from various blockchains.
  • Create scalable systems that solve different problems using modern cloud technology and industry best practices.
  • Design database schemas that are performant, scalable, and maintainable.
  • Assist with data analysis queries and carry out data analysis on the infrastructure.
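To make the ingestion side of the role concrete, here is a minimal, hypothetical sketch of one ETL step: building a standard Ethereum JSON-RPC request for a block and normalizing the hex-encoded fields in the response into plain values ready for loading. The function names and the sample response are illustrative, not Nethermind code.

```python
import json

def block_request(block_number: int) -> str:
    """Build an eth_getBlockByNumber JSON-RPC request body."""
    return json.dumps({
        "jsonrpc": "2.0",
        "method": "eth_getBlockByNumber",
        "params": [hex(block_number), False],  # False: tx hashes only
        "id": 1,
    })

def normalize_block(result: dict) -> dict:
    """Decode the hex quantity fields a block response carries
    into integers suitable for a warehouse row."""
    return {
        "number": int(result["number"], 16),
        "timestamp": int(result["timestamp"], 16),
        "gas_used": int(result["gasUsed"], 16),
        "tx_count": len(result["transactions"]),
    }

# Hand-made sample response; a real pipeline would POST block_request()
# to a node endpoint and read the "result" field of the reply.
sample = {
    "number": "0x10d4f",
    "timestamp": "0x55ba467c",
    "gasUsed": "0x5208",
    "transactions": ["0xabc..."],
}
row = normalize_block(sample)
print(row)
```

In a production pipeline this step would typically run inside an orchestrator task (e.g. an Airflow operator) with retries, since public RPC endpoints rate-limit and occasionally return partial data.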

You have:

  • Experience extracting on-chain data and a good understanding of blockchain concepts.
  • Experience in greenfield data engineering projects, particularly data infrastructure.
  • Advanced knowledge of modern data pipeline architecture and cloud platforms, e.g. AWS/GCP/Azure.
  • Proven success communicating with users, other technical teams, and senior management to gather requirements and explain data modeling decisions and data engineering strategies.
  • Hands-on experience designing data pipelines that join structured and unstructured data.
  • Knowledge of data pipeline tools (e.g. Snowflake/Redshift/BigQuery, AWS Lambda/S3, Apache Airflow, Spark/Hadoop).
  • Comfort with one or more of Python, Scala, Java, or Go.
  • Comfort writing SQL queries.

Nice to have:

  • Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, and build processes.
  • Comfort with infrastructure as code (e.g. Terraform/Terragrunt).
  • Strong understanding of distributed systems and RESTful APIs, and hands-on experience with JSON-RPC endpoints.
  • Strong communication skills and the ability to run analyses independently.

Perks and benefits:

  • Fully remote
  • Flexible working hours
  • Equity

Join us!

We are always on the lookout for talent!

To apply for this job, please visit boards.eu.greenhouse.io.