Job Description:
Lambda is the DeFi Intelligence Platform.
Design, maintain, and scale streaming ETL pipelines for blockchain data.
Build and optimize ClickHouse data models and materialized views for high-performance analytics.
Develop and maintain data exporters using orchestration tools.
Implement data transformations and decoding logic.
Combine multiple data sources (indexers and third-party Kafka topics) into aggregated tables that power our API.
Establish and improve testing, monitoring, automation, and migration processes for pipelines.
Ensure timely delivery of new data features in alignment with product goals.
Build automation tools that keep data analyst inputs, such as dictionaries, up to date.
Job Requirements:
Strong SQL skills with columnar databases (ClickHouse, Druid, BigQuery, etc.).
Hands-on experience with streaming frameworks (Flink, Kafka, or similar).
Solid Python skills for data engineering and backend services.
Proven track record of delivering pipelines and features to production on schedule.
Strong focus on automation, reliability, maintainability, and documentation.
Startup mindset: balancing speed with quality.
Nice to Have:
Experience operating ClickHouse at scale (performance tuning, partitioning, materialized views).
Experience with CI/CD and automated testing for data pipelines (e.g., GitHub Actions, dbt).
Knowledge of multi-chain ecosystems (EVM & non-EVM).
Familiarity with blockchain/crypto data structures (transactions, logs, ABI decoding).
Contributions to open-source or blockchain data infrastructure projects.
Benefits:
Fully remote
Full-time contractor (Indefinite-term Consultancy Agreement)
Competitive salary in USD (we can also pay in crypto)
Paid vacation and sick leave
Well-being program
Mental Health care program
Compensation for education, including foreign language & professional growth courses
Equipment & co-working reimbursement program
Overseas conferences, community immersion