Data Engineer

Job at Edge & Node

$70k-150k

Remote

Full time

Edge & Node is a creative software development company working to build a vibrant, decentralized future. Founded by the initial team behind The Graph, Edge & Node is dedicated to the advancement of web3, a decentralized and fair internet where public data is available to all—an internet that enables its users to increase agency over their creations and their lives.

Edge & Node’s initial product is The Graph, an indexing protocol for querying networks like Ethereum and IPFS, which ensures open data is always available and easy to access. The Graph is used by thousands of protocols and dapps including Uniswap, Livepeer, Aave, Decentraland, and more. Edge & Node also launched Everest, a decentralized registry with the mission to catalyze the shift to web3, facilitating community-driven curation of projects providing ongoing utility to the crypto space.

The Engineering Operations & Customer Success team works closely with all other Engineering teams across Edge & Node to ensure the services we operate are reliable, performant, secure, and predictable. We focus on a mix of software development, operational automation, cyber security, and collaboration with other teams to help take our service delivery to the next level.

We are looking for an early-career Data Engineer to focus on developing and maintaining data science pipelines. Ideally, the team would like to bring on someone who has experience with the tools currently in use, which include, but are not limited to, Redpanda, Materialize, and GCP. In this role, you will monitor and maintain the reliability of the Redpanda cluster, streaming database, DBT jobs, QoS oracle, and other data engineering systems. You'll be expected to learn Materialize and help migrate BigQuery models to reduce costs. In addition, you will help establish and maintain good standards around documentation and internal educational tools, and respond to data engineering/devops requests in our incident management process.

What You’ll Be Doing

  • Learning our infrastructure and data engineering toolset
  • Partnering closely with our Data Science team to perform various data warehouse jobs and periodic Redpanda/streaming database devops tasks
  • Managing historical data models in BigQuery/DBT
  • Developing pipelines to support Superset dashboards and performing related devops tasks

What We Expect

  • Experience with one or more of the following: BigQuery, ETL automation/workflow tools (DBT), BI/dashboarding tools (Apache Superset), streaming data platforms (Apache Kafka, Redpanda, or Confluent), or other data engineering and data warehouse toolsets/environments
  • Some experience or knowledge of container orchestration and configuration tools such as Kubernetes and Kustomize preferred
  • Some experience or knowledge of monitoring and alerting (Grafana dashboards) preferred
  • Some experience or knowledge of SQL, including the ability to create and manage tables within a SQL database
  • Proficiency in one or more programming languages, such as Python, R, or Rust
  • Must be able to serve on-call shifts and support devops needs
  • Ability to create documentation and communicate with a variety of audiences
  • Clear communication skills (written and verbal) to document processes and architectures
  • Ability to work well within a multinational team environment
  • Preference for candidates physically located in the Americas; however, the team is open to candidates in European time zones or other locations

Company: Edge & Node

Skills: data science dev crypto ethereum kubernetes

