Fusing AWS Cloud & Data Architecture with Blockchain Transactions


Sindri, a blockchain analytics startup that mines Ethereum transactions for market patterns, needed help optimizing its data ingestion pipeline on AWS to deliver accurate, comprehensive, and timely insights on market activity. The work included adding comprehensive monitoring on top of enhanced streaming data processing, planning a performant data warehouse to back their consumer-facing analytics platform, and providing architectural advice on topics ranging from CI/CD pipelines to secure parameterization of databases and critical configuration settings across multiple pre-production development environments.


To support this small team on a limited budget, evolv implemented several business-critical processes for Sindri within a 12-day engagement, while creating a roadmap for scalable, cost-effective long-term growth of their AWS infrastructure. When adverse events forced disaster recovery, evolv helped the team rebuild their data and processes with a focus on security.

  • Worked closely with Sindri’s engineers and key stakeholders to define the data sources, design the architecture and security structure, and develop the cloud environment and enterprise data warehouse (EDW)
  • Optimized data ingestion for speed and security, built business-critical monitoring, and assisted with system outages that compromised database access
  • Provided documentation and engaged with the team on further optimizations to their development processes, from business-critical ETL to security, disaster recovery, and CI/CD


  • Optimized data ingestion to take advantage of low-cost, massively parallel compute so that ingestion stays synchronized with the Ethereum blockchain up to the second and runs concurrently with jobs backfilling historical data
  • Built a monitoring task that watches for system outages, gaps in data, and stale data; engineers are now alerted to ingestion problems by a Discord bot
  • Hardened the database with roles scoped to specific users and applications, and tuned its configuration parameters around key characteristics of the source data for high write performance
  • System and application parameters are stored securely and adapt to the application’s execution environment (e.g. QA vs. production)
  • Resources are now secured by access roles and security groups, locking out bad actors
  • Sindri is fully empowered to own their data ingestion and ETL end to end, including cost analysis as they grow, structuring the data lake for the best price, and experiments to run as they optimize query performance for analysts
  • Sindri has a concrete action plan for building out CI/CD pipelines for streamlined application deployment with an eye towards building infrastructure as code
  • Event-driven architectures are being leveraged to instantly process new data
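The gap monitoring described above can be sketched as a simple check over ingested block numbers, with an alert posted to a Discord webhook. This is a minimal illustration, not Sindri's actual implementation: the function names and the webhook URL are hypothetical, and a real task would read block numbers from the warehouse rather than a literal list.

```python
import json
import urllib.request

def find_gaps(block_numbers):
    """Return (start, end) ranges of block numbers missing from the input."""
    gaps = []
    nums = sorted(block_numbers)
    for prev, cur in zip(nums, nums[1:]):
        if cur - prev > 1:
            gaps.append((prev + 1, cur - 1))
    return gaps

def alert_discord(webhook_url, message):
    """Post an alert message to a Discord webhook (URL is hypothetical)."""
    body = json.dumps({"content": message}).encode("utf-8")
    req = urllib.request.Request(
        webhook_url, data=body, headers={"Content-Type": "application/json"}
    )
    urllib.request.urlopen(req)

# Example: blocks 4 and 7-8 never landed in the warehouse.
gaps = find_gaps([1, 2, 3, 5, 6, 9])
if gaps:
    msg = "Data ingestion gaps detected: " + ", ".join(f"{a}-{b}" for a, b in gaps)
    # alert_discord("https://discord.com/api/webhooks/...", msg)  # enable with a real webhook
```

The same gap check doubles as a staleness check: if the newest ingested block is far behind the chain head, the task can alert on that too.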
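Environment-adaptable parameter storage of the kind described above is commonly built on a hierarchical secret store. The sketch below assumes AWS Systems Manager Parameter Store and a hypothetical `/sindri/<env>/<name>` naming scheme; both the scheme and the helper names are illustrative assumptions, not the team's actual configuration.

```python
def param_path(env: str, name: str) -> str:
    # Hypothetical hierarchy: the same application reads /sindri/qa/...
    # in QA and /sindri/production/... in production, so no code changes
    # are needed between environments.
    return f"/sindri/{env}/{name}"

def get_parameter(env: str, name: str) -> str:
    # Requires boto3 and AWS credentials; imported locally so the pure
    # path helper above stays dependency-free.
    import boto3
    ssm = boto3.client("ssm")
    resp = ssm.get_parameter(Name=param_path(env, name), WithDecryption=True)
    return resp["Parameter"]["Value"]
```

Storing values as SecureString parameters keeps secrets encrypted at rest, and IAM policies can grant each environment access only to its own path prefix.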
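One common shape for the event-driven processing mentioned above is an S3 "object created" notification invoking a Lambda function. The handler below only parses the standard S3 event payload; it is a sketch of that pattern under assumed bucket and key names, not a description of Sindri's code.

```python
def handler(event, context=None):
    """Lambda entry point: extract (bucket, key) pairs from an S3 event."""
    objects = [
        (rec["s3"]["bucket"]["name"], rec["s3"]["object"]["key"])
        for rec in event.get("Records", [])
    ]
    for bucket, key in objects:
        # In a real pipeline this is where the newly arrived file would be
        # parsed and loaded into the warehouse.
        print(f"processing s3://{bucket}/{key}")
    return objects
```

Because each new object triggers its own invocation, fresh data is processed the moment it lands, with no polling loop to maintain.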