Data Dashboards for Liquid Crypto

Disclaimer: This is not financial advice. Anything stated in this article is for informational purposes only, and should not be relied upon as a basis for investment decisions. Chris Keshian may maintain positions in any of the assets or projects discussed on this website.



This is a seven-part series on FJ Labs’ investment process for liquid crypto:

  1. Building FJ Labs Liquid Crypto

  2. Liquid Venture Capital

  3. Liquid Crypto Market Segmentation

  4. Identifying Significant Variables that Impact Prices

  5. Data Dashboards for Liquid Crypto

  6. Research Process for Liquid Crypto

  7. Active Portfolio Management for Liquid VC



As discussed in a previous post, in order to identify which factors were most significant in predicting token returns, we segmented the crypto market into two high-level categories: ecosystem tokens and app tokens. We then regressed a robust set of independent variables against the change in asset price for each category. These independent variables are reproduced below and comprise both macro and project-specific factors.


Ecosystem Tokens

App Tokens


In this post, I will discuss how we use the most significant variables from this analysis as the basis of our data dashboards for each project category.

 

Using Regression as a Starting Point

In the crypto space, most code is open source and everything that happens on the blockchain is transparent (barring zero-knowledge proofs and other technical obfuscation). This means that, with sophisticated data science capabilities, we can aggregate the most relevant metrics for each vertical we invest in and track these metrics over time to better understand how the use of different blockchains and decentralized applications evolves.

 

Using the above regression results, we identified key metrics for two high-level categories: ecosystem tokens and app tokens. We then further subdivided these categories into 19 specific verticals (DEXs, Derivatives, Lending, etc.). After successive rounds of regression for each vertical, we identified the fundamental metrics that have historically been most significant for a given vertical. These variables then inform which data we track for each vertical. We use the dashboards we create from this process to monitor growth and competitive position within each vertical.
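The regression step described above can be sketched in miniature. This is a hedged illustration only: the metric names are hypothetical stand-ins, the data is synthetic, and the actual variable set and methodology are the fund's own. It shows the core mechanic of fitting an OLS model of metric changes against price changes and keeping the metrics whose t-statistics clear a significance threshold.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # synthetic weekly observations for one vertical

# Stand-ins for real metrics (e.g. active addresses, TVL, fee revenue).
X = rng.normal(size=(n, 3))
# Synthetic price changes: metric 0 matters a lot, metric 2 a little.
y = 0.5 * X[:, 0] + 0.1 * X[:, 2] + rng.normal(scale=0.5, size=n)

Xc = np.column_stack([np.ones(n), X])           # add an intercept column
beta, *_ = np.linalg.lstsq(Xc, y, rcond=None)   # OLS coefficient estimates
resid = y - Xc @ beta
sigma2 = resid @ resid / (n - Xc.shape[1])      # residual variance
se = np.sqrt(sigma2 * np.diag(np.linalg.inv(Xc.T @ Xc)))
t_stats = beta / se
# Keep metrics (excluding the intercept) with |t| > 1.96 (~5% level).
significant = [i for i, t in enumerate(t_stats[1:]) if abs(t) > 1.96]
print(significant)
```

In practice this would be run per vertical, and the surviving metrics become the columns tracked in that vertical's dashboard.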

Building our Data Dashboard Suite

 

For each vertical, we create a suite of data dashboards that provides a comprehensive view of activity in that vertical and serves as a fundamental source of truth upon which we can base our investment and trade management decisions.

 

The output of our original regression work highlights the independent variables that are relevant to track for both the ecosystem and app token categories. By way of example, if we subdivide ecosystem tokens, we get the following verticals: Layer 0s, Layer 1s, and Layer 2s. We built a unique set of internal dashboards for each of these verticals.

Below I will provide an example of how we use these dashboards to make data-driven decisions around capital allocation and investment sizing for the “Layer 2” vertical.

  

*A note before continuing - The below data dashboard case study focuses on the Layer-2 vertical for Ethereum. After completing this dashboard build, we realized the data insights would be valuable to the broader Ethereum community. We applied for and received an Ethereum Foundation grant to build out the interface further and open-source this dashboard. The resulting interface can be found at growthepie.xyz. You can find a screenshot of this dashboard below.


Layer 2 Case Study

 

With the proliferation of Layer 2 execution environments, decentralized applications that were formerly built on the Ethereum base chain have migrated to Layer 2 execution environments for faster settlement times and lower fees.

 

One type of Layer 2 execution environment is the optimistic rollup, the leaders of which are Arbitrum and Optimism.

 

The first step in our analysis involves aggregating and cleaning data across Ethereum, Arbitrum, and Optimism to create a high-level view of how these different ecosystems are growing or contracting relative to one another.
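A minimal sketch of this aggregation step, assuming per-chain data has already been pulled into simple daily tables. The column names and values here are illustrative, not real chain data; the point is combining per-chain frames into one comparable view with relative shares.

```python
import pandas as pd

# Illustrative daily metrics per chain (made-up numbers).
eth = pd.DataFrame({"date": ["2023-01-01"], "txs": [1_000_000]})
arb = pd.DataFrame({"date": ["2023-01-01"], "txs": [400_000]})
opt = pd.DataFrame({"date": ["2023-01-01"], "txs": [300_000]})

frames = {"ethereum": eth, "arbitrum": arb, "optimism": opt}
# Stack the chains into one frame, labeling each row with its chain.
combined = pd.concat(
    [df.assign(chain=name) for name, df in frames.items()],
    ignore_index=True,
)
# Each chain's share of total transactions on a given date.
combined["tx_share"] = (
    combined["txs"] / combined.groupby("date")["txs"].transform("sum")
)
print(combined[["chain", "tx_share"]])
```

From a view like this, growth or contraction of one ecosystem relative to the others falls out directly as a time series of shares.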

 
Ethereum


Arbitrum and Optimism


Refining Our Analysis

As these different ecosystems grow across the key metrics we identified during the "Regression" phase of our process, we further refine our analysis by understanding which activities are gravitating to certain environments over others. To create this view, we built a set of dashboards that parse out specific activities on each chain. This process was extremely manual: we had to go through all of the smart contracts on each chain and categorize each project using the following tags:

 

  • DeFi DEX

  • DeFi General

  • Utility

  • NFT

  • Stablecoin

  • Bridge

  • ERC20

  • Native Transfer

  • Arbitrage and MEV

  • CEX

 

We then fed this data into our dashboard suite for each chain, as shown in the images below.
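The tagging step above amounts to maintaining a hand-curated mapping from contract address to activity tag, then bucketing on-chain transactions by that mapping. The sketch below uses invented addresses and a tiny transaction sample purely for illustration; the real mapping covers all contracts on each chain.

```python
from collections import Counter

# Hand-curated mapping from contract address to activity tag
# (addresses here are placeholders, not real contracts).
CONTRACT_TAGS = {
    "0xA1": "DeFi DEX",
    "0xB2": "NFT",
    "0xC3": "Bridge",
}

def tag_transactions(txs):
    """Count transactions per activity tag; unknown contracts go to 'Unlabeled'."""
    counts = Counter()
    for tx in txs:
        counts[CONTRACT_TAGS.get(tx["to"], "Unlabeled")] += 1
    return counts

sample = [{"to": "0xA1"}, {"to": "0xA1"}, {"to": "0xD4"}]
print(tag_transactions(sample))
```

The "Unlabeled" bucket is what makes the process so manual: every new contract that attracts meaningful activity has to be researched and assigned a tag by hand.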

 

Ethereum

Arbitrum

Optimism


Deriving Insights and Making Capital Allocation Decisions

 

Using the insights from these dashboards, we can determine which activity consumes the most gas on each chain, which serves as a proxy for blockspace demand (i.e., in a blockspace auction, these activities bid higher for space and are therefore more valued on that specific chain).
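The gas-as-blockspace-proxy idea reduces to a simple aggregation once transactions are tagged. The figures below are invented for illustration; the calculation itself is just each tag's share of total gas consumed on a chain.

```python
def gas_share_by_tag(txs):
    """Return each activity tag's share of total gas consumed."""
    totals = {}
    for tx in txs:
        totals[tx["tag"]] = totals.get(tx["tag"], 0) + tx["gas_used"]
    grand_total = sum(totals.values())
    return {tag: gas / grand_total for tag, gas in totals.items()}

# Illustrative tagged transactions for one chain on one day.
sample = [
    {"tag": "DeFi DEX", "gas_used": 600_000},
    {"tag": "NFT", "gas_used": 300_000},
    {"tag": "Bridge", "gas_used": 100_000},
]
print(gas_share_by_tag(sample))
```

Tracked over time and across chains, shifts in these shares are what surface the migration patterns discussed next.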

 

For example, if we see that DeFi activity on Arbitrum is growing at a faster rate than DeFi activity on Ethereum and Optimism, we will monitor this closely. If the trend persists, we then do a deep dive into DeFi projects built on Arbitrum. After writing a report on Arbitrum DeFi projects, we can compare them to one another and invest in the most promising projects in that vertical for that specific ecosystem.

 

Expanding Our Analysis Across All Ecosystems

 

Using the same process shown above for Layer 2s, we have built a set of high-level key metric views and fine-grained analysis dashboards for all of the verticals we track across the crypto space.

 

As the space grows, and activity gravitates toward different ecosystems and execution environments, we believe we will be able to make more rigorous, data-driven capital allocation decisions in advance of price movements across multiple verticals.

In my next post, I will discuss how we corroborate our data findings with a robust research process for each project.
