Snowflake: The Market Is Already Pricing In Long-Term Success

Source: Snowflake Press and News

Snowflake (NYSE: SNOW) is a best-in-class innovator given its technology and will likely continue to grow at tremendous rates based on its overall business excellence. The company commanded a hefty valuation from the primary market and is currently trading at more than double its IPO price, at an EV/S of ~175x. Does anyone remember a time when a software IPO with a 25x EV/S multiple was considered expensive? We’re well over 100x now.

In this article, I’ve attempted to unpack the modern data needs that drive Snowflake’s product-market fit and the business model that has turned it into an organic sales growth juggernaut. At this post-IPO, cash-burning stage, I reckon the core product and technology should be given higher weight in a thesis, and they help contextualize why the IPO price was high in the first place. Despite what I see as an excellent business and product, I’m staying on the sidelines. In my opinion, the valuation multiples are already pricing in the company’s long-term success without compensating for risks and a lack of visibility. I think the stock should be avoided for now.

Premise

Snowflake’s all-encompassing product is its Cloud Data Platform. While the platform enables quite a few categories of data workloads, as shown by the infographic below, I’d classify Snowflake as a data management platform at its core. It is well-known that enterprises worldwide rely heavily on data as a central component of how they think and do business. The amount of data in the average enterprise is scaling exponentially, the workloads and use cases for that data are increasing, the types of data are more diverse than before, and the need to make sense of it all is greater than ever. This comes with a host of governance and security problems that need to be addressed while businesses continue migrating to the cloud. It’s therefore not far-fetched to say that managing all that data effectively is mission-critical, and having a solution that can scale with the increasing complexity and future requirements of enterprises is pretty important. Snowflake directly tackles these challenges.


Traditional data management tools, as well as the innovators that have emerged in the past 10 years, have had to keep up with ever-changing requirements. Historically, these tools have fallen far short of a frictionless experience for users, with several shortcomings. I’d like to first walk through these issues by talking through the evolution of data management solutions. I believe this is particularly relevant to understanding how Snowflake has innovated and why it solves the problems it does. It also provides some colour on the high IPO valuation multiple, which, as I’ll remark upon later, can be partly attributed to great tech and execution.

The Evolution Of Data Warehousing

Data can be structured, unstructured, or semi-structured. Structured data such as names and addresses can be stored in neat, organized formats such as tables. The majority of data in the past was of this structured nature, with databases and data warehouses used to store it on-premise. This gave rise to enterprise data warehouses (EDWs) back in the day.

Source: Wikimedia Commons

The “ETL” in the above diagram stands for extract, transform, and load. ETL is a process that copies data from one place to another, rearranging it along the way according to the destination’s requirements and context. As end-users or specific business units need relevant data in forms tailored to their functions, “data marts” are used as intermediaries. EDWs were fine for structured data workloads but not so much for semi-structured data (CSV or JSON files) or unstructured data (videos, images). On-premise solutions meant that storage and compute were limited, so enterprises needed to plan well in advance for resources and maintenance. While some obvious use cases for data could be sales and financial reporting, we now have analytics, machine learning, and AI workloads across both structured and unstructured data. This further stresses traditional storage and compute architectures in terms of resources, queueing, and latency.
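For readers who haven’t worked with ETL pipelines, here is a minimal sketch of the pattern in Python, using only the standard library; the file name, column names, and the local SQLite “warehouse” are hypothetical stand-ins for illustration, not anything specific to Snowflake.

```python
# Minimal ETL sketch (illustrative only): extract rows from a hypothetical
# orders.csv export, transform them into the shape a reporting data mart needs,
# and load them into a local SQLite table standing in for the warehouse.
import csv
import sqlite3


def extract(path: str) -> list[dict]:
    """Read raw rows from the source system's CSV export."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))


def transform(rows: list[dict]) -> list[tuple]:
    """Reshape raw rows to the destination schema: one (customer, revenue) pair per order."""
    return [(r["customer_id"], float(r["quantity"]) * float(r["unit_price"])) for r in rows]


def load(rows: list[tuple], db_path: str = "sales_mart.db") -> None:
    """Write the transformed rows into the reporting table."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS order_revenue (customer_id TEXT, revenue REAL)")
    conn.executemany("INSERT INTO order_revenue VALUES (?, ?)", rows)
    conn.commit()
    conn.close()


if __name__ == "__main__":
    load(transform(extract("orders.csv")))
```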

We saw the rise of data lakes as a solution to the issues created by semi-structured and unstructured data. Where a warehouse suggests organized items, each in its respective place, a lake implies a fluid mixture of items. Data lakes allowed raw data of all types to be stored in its original format in a single repository. The technology that enabled these data lakes was originally based on the Apache Hadoop ecosystem, open-source software that enables scalable, distributed computing. Onboarding data to lakes didn’t necessarily mean that lakes could be used easily to produce results and insights. Navigating, orchestrating, and extracting data from data lakes proved to be complicated and required specialized skills and other resources to prevent them from turning into data swamps. Hadoop also offered only simplistic controls for the governance, security, and privacy challenges that arise with data.

Traditional data warehouses and data lakes eventually moved towards the cloud. Storage and compute resources were more effectively utilized, with big providers like Amazon’s AWS (NASDAQ: AMZN), Microsoft’s Azure (NASDAQ: MSFT), and Google’s Google Cloud Platform (NASDAQ: GOOGL) offering better optimization and scaling of resources. For the common enterprise, dealing with hardware was mostly eliminated as we saw the rise of Infrastructure-as-a-Service (IaaS). Software, on the other hand, has had to continuously evolve to enable workloads, security and governance, control over access, encryption, and more. Then there’s also the factor that not all cloud-based data warehousing is designed the same. Replicating enterprise-level architectures and simply transferring them to the cloud leaves a lot of optimization on the table. Latency and concurrency need to be minimized, and realistically achievable price-to-performance needs to be maximized, to deliver optimal value for enterprise users. Unlocking the true value from data that empowers businesses requires the elimination of friction.

This is where Snowflake comes in with a modern cloud data platform offering. It isn’t an infrastructure provider itself but a platform that leverages the big three’s infrastructure to deliver data management of all kinds across a single unified platform. This encompasses data warehousing and data lakes, amongst other things, through an architecture purpose-designed for the cloud. Snowflake is therefore an enabler of everything you need to do with data, in a highly optimized fashion.

Snowflake & Next-Gen Warehousing Architecture

Source: Snowflake

The company’s forward-looking solution is a unified cloud data platform architected specifically for the cloud. Snowflake offers a solution with the following elements:

  • Handles diverse data types
  • Scales across massive data volumes
  • Enables multiple use cases and concurrency
  • Optimizes for price-to-performance
  • Is easy to use
  • Runs multi-cloud and multi-region
  • Offers pay-as-you-need pricing
  • Enables seamless and secure data sharing

Source: Snowflake’s S-1

While the popular press has referred to Snowflake as a data warehousing solution, the company calls its main offering a cloud data platform, as it is expanding beyond traditional storage and compute. It is important to note that Snowflake is not an infrastructure provider itself; it is a layer that rests upon the existing public cloud providers AWS, Azure, and GCP. At the heart of Snowflake’s platform is an architecture that differentiates it and solves a lot of the problems faced by previous-gen solutions.

Built from the ground up for the cloud, Snowflake’s unique multi-cluster shared data architecture delivers the performance, scale, elasticity, and concurrency today’s organizations require.

Snowflake is a single, integrated platform delivered as-a-service. It features storage, compute, and global services layers that are physically separated but logically integrated. Data workloads scale independently from one another, making it an ideal platform for data warehousing, data lakes, data engineering, data science, modern data sharing, and developing data applications.

Snowflake Website

By centralizing storage, users have a single source of truth for all their data. By using a decoupled multi-cluster compute layer, compute resources can be elastic and scale independently across different workloads as needed. The services layer optimizes everything else and provides a layer of control over all sorts of facets of managing the data. While these three layers make intuitive sense for us investors, there is some hardcore engineering involved in achieving this architecture and having it function at scale at the performance it does. Specifically, separating storage and compute has minimized resource contention and helped concurrency dramatically. What took several hours in paired storage-compute on-premise architectures takes minutes on Snowflake. The management team is stacked with PhDs who’ve specialized in parallel-processing database systems and optimization for most of their careers. While the technical details are beyond me, the results their customers claim are something to go by. I recommend browsing through customer studies and this commissioned, but independently conducted, Forrester study on the ROI of the platform.
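To make the decoupled-compute idea concrete, here is a minimal sketch using the snowflake-connector-python package; the credentials, warehouse names, and sizes are placeholders I chose for illustration, but the point is that separate virtual warehouses can be created and resized independently while querying the same shared data.

```python
# Sketch of decoupled compute on shared storage using snowflake-connector-python.
# Credentials and object names below are placeholders, not real values.
import snowflake.connector

conn = snowflake.connector.connect(
    user="YOUR_USER", password="YOUR_PASSWORD", account="YOUR_ACCOUNT"
)
cur = conn.cursor()

# One warehouse sized for interactive BI dashboards...
cur.execute("CREATE WAREHOUSE IF NOT EXISTS bi_wh WAREHOUSE_SIZE = 'SMALL' AUTO_SUSPEND = 60")
# ...and a separate, larger one for a heavy data-science batch job over the same data.
cur.execute("CREATE WAREHOUSE IF NOT EXISTS ds_wh WAREHOUSE_SIZE = 'LARGE' AUTO_SUSPEND = 60")

# Scale the batch warehouse up for a big backfill without touching the BI workload.
cur.execute("ALTER WAREHOUSE ds_wh SET WAREHOUSE_SIZE = 'XLARGE'")

cur.close()
conn.close()
```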

The end goal of data management solutions should be to remove all friction involved in using data. Snowflake appears to be a leap forward in that direction, and the results show it. Companies claim accelerated time to market, increased profits, improved decision-making, and significant cost savings on infrastructure management when compared to previous-gen solutions. The cloud data platform isn’t just helping manage data needs but is actively contributing to a better, faster, and more well-informed business for its customers. This is partly why it is so highly admired in Silicon Valley: the second-order effects of adopting the platform are far-reaching and influential across both quantitative and qualitative aspects of businesses.

Business Model & Market Opportunity

Snowflake is a customer of AWS/Azure/GCP, and its cost of revenue and gross margins therefore depend on its ability to negotiate with the big three for infrastructure services. This relationship with the major IaaS providers is discussed in further detail in the next section.

Source: Snowflake Website

The cloud platform is offered through a highly flexible pricing model. Billing is calculated based on usage of storage, compute, and services. Storage is charged as a monthly fee determined by the amount of data that needs to be stored, while compute and cloud services are paid for with “Snowflake credits”. Credits can be purchased and then redeemed for compute services as and when needed. Cloud services (the services layer) are mostly free up to a threshold, after which the customer is charged extra.
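As a back-of-the-envelope illustration of how this usage-based billing scales with consumption, here is a small sketch; the per-terabyte and per-credit rates below are hypothetical placeholders, since actual pricing depends on edition, cloud region, and contract terms.

```python
# Back-of-the-envelope sketch of a usage-based bill.
# Rates are purely illustrative placeholders, not Snowflake's actual prices.
STORAGE_PRICE_PER_TB_MONTH = 25.0  # hypothetical $/TB/month for storage
PRICE_PER_CREDIT = 3.0             # hypothetical $ per Snowflake credit
CREDITS_PER_HOUR = {"XSMALL": 1, "SMALL": 2, "MEDIUM": 4, "LARGE": 8}  # credits consumed per warehouse-hour


def monthly_bill(tb_stored: float, warehouse_size: str, compute_hours: float) -> float:
    """Storage billed on data volume; compute billed in credits per warehouse-hour."""
    storage_cost = tb_stored * STORAGE_PRICE_PER_TB_MONTH
    compute_cost = CREDITS_PER_HOUR[warehouse_size] * compute_hours * PRICE_PER_CREDIT
    return storage_cost + compute_cost


# Example: 50 TB stored, a MEDIUM warehouse running ~200 hours in the month.
print(f"Estimated monthly bill: ${monthly_bill(50, 'MEDIUM', 200):,.0f}")
```

The takeaway is that the bill grows with data volume and compute hours, not with seat counts, which is exactly the dynamic discussed next.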

The key takeaway here is that as the data organizations require increases exponentially, so do the storage and compute resources they consume. There is a natural tailwind to revenues at play here, and it is almost irreversible for organizations that continue to use Snowflake. Ten years from now, even if Snowflake isn’t adding many more customers, the terminal revenue growth rate would still be high, as it is a function of data and compute requirements rather than seats or employees.

Investors should note the differences between a predictable subscription model and this elastic model. The typical SaaS subscription business model has commanded a hefty premium from the markets due to revenue predictability and visibility. This model is better in my opinion: while it gives up some of that recurring-revenue predictability, it rides the inevitable trend of increasing data in the enterprise. This is mainly why Snowflake is enjoying phenomenal dollar-based net retention rates and will continue to do so.

The company estimates its current market opportunity, or in other words its Total Addressable Market (TAM), at $81B according to its latest S-1 filing. The estimate takes current recurring revenue across three customer-size cohorts and applies it to every business with more than 200 employees in the S&P Capital IQ database. This should be taken with some skepticism in my opinion, and investors should know that there’s currently little visibility into Snowflake’s likelihood of capturing most of that TAM over the company’s lifetime. Several competitors will inevitably arise, especially with the big three IaaS providers directly competing with Snowflake while also selling them their services. For companies as young as Snowflake, greenfield growth needs to be balanced against competitive threats.
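To show what that kind of cohort-based methodology looks like mechanically, here is a sketch; every spend-per-company and company-count figure below is my own hypothetical placeholder, not a number disclosed by Snowflake.

```python
# Sketch of a cohort-based TAM estimate like the one described above.
# All figures are hypothetical placeholders, not Snowflake's S-1 inputs.
hypothetical_cohorts = {
    # cohort name: (assumed annual spend per company in $, assumed number of companies >200 employees)
    "large enterprises":  (1_500_000,   5_000),
    "mid-size companies": (  300_000,  40_000),
    "smaller businesses": (   75_000, 200_000),
}

tam = sum(spend * count for spend, count in hypothetical_cohorts.values())
print(f"Illustrative TAM: ${tam / 1e9:.1f}B")  # ~$34.5B with these placeholder inputs, not the S-1's $81B
```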

Cooperation Or Competition?

G2 Crowd is a popular community review site. I’ve previously found its G2 Grids quite telling as an alternative data source for unbiased product quality and sales momentum. Snowflake boasts the highest satisfaction ratings from the community and is a leader in the industry, up there with Amazon and Google. The Momentum Grid aggregates the rate of change from community reviews and other third-party data. Snowflake is well above the rest of the competition in momentum, which doesn’t come as a surprise given its excellent sales growth.

Source: G2 Grid for Data Warehouse

Source: G2 Grid for Data Warehouse, Trending

Gartner Peer Insights, TrustRadius, and Capterra also provided some interesting community-based insight, with Snowflake consistently scoring among the highest in customer satisfaction ratings.

The main competitive threats I can foresee are the big three cloud infrastructure shops. Amazon has Redshift, Google offers BigQuery, and Microsoft offers Azure Synapse (formerly Azure SQL Data Warehouse), all of which directly compete with the kind of data warehousing solutions Snowflake offers. Even if Snowflake currently has a technological moat that lends it an advantage over the short term, the other three are stacked with resources to hire the best engineers possible to replicate a multi-cluster architecture and a SQL-interfaced data management system. There’ll likely be enough intelligence to work around Snowflake’s patents without infringing on them and produce great products soon enough.

The relationship between Snowflake and the three is symbiotic and competitive at the same time. As more businesses end up on Snowflake, the company is actively diverting enterprises from purchasing infrastructure services directly from the three at higher prices. Part of the reason Snowflake works is that it can negotiate infrastructure at wholesale prices and re-offer it, with its layered architecture, to its customers. I highly doubt that Amazon is going to let Snowflake take the whole $81B-and-growing TAM without a fight. The other interesting chess match is that if Amazon raises its prices for Snowflake as a customer, it might lose Snowflake’s significant and growing business to Azure or GCP. There’s a form of game theory at play here, and Snowflake’s prominence itself pits the big three against each other, making for an interesting competitive dynamic. This brings us back to what strategies might work for the big three. If I had to guess, it is in their best interests to evolve their data management offerings and cut out the middleman, which in this case is Snowflake.

From a customer perspective, preventing vendor lock-in is also a factor in deciding which service to choose for data warehousing. This is where Snowflake’s cloud-agnostic architecture could be a selling point. As we can see, this makes for a complicated competitive landscape. It will be interesting to see Redshift, BigQuery, and Azure Synapse evolve in the coming years. While Snowflake might experience some more greenfield growth in the short run, I don’t think it will be long before we have more intense competition.

I’d like to point out to readers that this is neither a zero-sum game nor a win-win situation for Snowflake and the IaaS providers. It is somewhere in between, and long-term success will be based on the decisions and actions the parties take in how they compete. These factors are likely why pricing the Snowflake IPO was so difficult for the bankers. MongoDB, for example, has thrived even while the IaaS providers have had similar NoSQL offerings for the past few years. It isn’t unreasonable to expect the same for Snowflake, though it appears more directly competitive and ambitious with its overarching “Data Cloud” vision.

Growth Metrics & Financials

Key Performance Metrics

Source: Author, Data from Snowflake’s S-1 Filing

The last four quarters have shown accelerating growth on a quarter-over-quarter basis. Product revenue grew +117% yoy and +23% qoq for Q2 FY21 (ending late July), which is excellent given some notable macro headwinds in the industry. There’s no sign of slowing down, though growth rates will eventually be limited by the expanding revenue base, so we can expect some natural deceleration eventually. So far the revenue trend shows only a slow deceleration, which can be partly attributed to the pricing model and to expansion in data storage and compute consumption from existing customers. The net retention rate stands at an impressive 158%, providing a very high floor to revenue growth before accounting for new customers.

The successful IPO should provide the company with significant ammunition to expand sales and marketing at full speed to capture its market opportunity. Remaining performance obligations (RPO) came in strong last quarter and are a leading indicator of substantial revenue to be recognized in the coming months. Note that there’s some seasonality here, as Q2s and Q4s appear to be stronger. All in all, I’m expecting high growth to continue.

Source: Author, Data from Snowflake’s S-1 Filing

Customer growth is also impressive, displaying some seasonal Q2 strength and strong net adds despite the pandemic situation. Big-contract customers (>$1m TTM revenue) are outpacing the broader customer base, indicating that Snowflake is gaining traction with up-market customers and large enterprises. This should provide a natural tailwind to keeping the dollar-based net retention rate high. Overall, the growth metrics paint an excellent picture, making the company one of the fastest-growing public software businesses.

Financial Performance

Source: Author, Data from Snowflake’s S-1 Filing

Note that Free Cash Flow was calculated as “Operating Cash Flow – CapEx”. “SBC”, or Stock-Based Compensation, was as reported in the S-1 filing. It was added back to the GAAP operating result to give a non-GAAP operating metric that can be compared across periods to gauge core operational improvement.
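For clarity, here is a minimal sketch of those two adjustments; the figures are hypothetical placeholders rather than Snowflake’s reported numbers.

```python
# Sketch of the two adjustments described above, with hypothetical placeholder figures.
operating_cash_flow = -45.0   # hypothetical operating cash flow, $m
capex = 18.0                  # hypothetical capital expenditures, $m
gaap_operating_loss = -170.0  # hypothetical GAAP operating result, $m
sbc_expense = 35.0            # hypothetical stock-based compensation expense, $m

free_cash_flow = operating_cash_flow - capex            # "Operating Cash Flow - CapEx"
non_gaap_operating = gaap_operating_loss + sbc_expense  # exclude the non-cash SBC expense

print(f"FCF: {free_cash_flow:.1f}m | Non-GAAP operating result: {non_gaap_operating:.1f}m")
```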

The chart above displays the six months ended July 31, 2020, versus the same period a year ago. Margins have substantially improved across the board. The FCF burn is fair considering the triple-digit revenue growth rate. Even though Snowflake’s platform rests on others’ infrastructure, the company has managed a commendable 62% gross margin over the last six months. This is a key metric to watch closely going forward, as we aren’t quite sure yet where it will level off and normalize. I certainly don’t expect an 85%-90% gross margin like the typical SaaS company. A flattening of gross margin expansion would mean less flowing through to the bottom line, indicating that the easy margin gains are coming to an end. Sales and marketing for H1 FY21 amounted to 79% of revenue. This is a substantial figure, and I expect it to stay high in the coming year.

On a TTM basis, Revenue stands at $402.7m, clocking an impressive 133% yoy growth. The stock performance will likely be heavily top-line driven going forward as it has been for other software IPOs. That brings us to Snowflake’s rich valuation.

Valuation & The Time Value Of Inevitability

Snowflake currently trades around a 175x EV/S multiple. Let’s assume that the company grows to rule its domain and becomes a Salesforce (NYSE: CRM) or Adobe (NASDAQ: ADBE) in 10 years. What sort of growth rate would that need, based on what sort of compounded rate of return? I’d like to explore two cases. One in which $120 is a fair price, and the other in which $250 is a fair price.

What if $120 is a fair price?

My intention with this exercise is to explore what the primary market may have been thinking versus what the stock market is pricing in. If I were a primary market investor (a subscriber at $120) and felt extremely confident in the company’s long-term prospects, I’d want at the very least a 20% annual rate of return over a long-term holding period. At the current $250 price, you’d earn only an 11.5% annual rate of return to match the $120 buyer’s terminal value from compounding at 20% over the 10 years. Over a shorter holding period, the implied return is even lower.

Source: Author, Author’s Calculations

Given these assumptions, the current stock price has already accounted for the next four years of returns for a $120 investor. If the primary market was right, there’s no justification to buy at $250. Granted, the stock market doesn’t move in smooth curves as shown above, and the risk of owning the stock declines as it grows into a blue-chip. If the holding period were 5 years, and the $120 price demanded a 30% annual return for the higher risk over this shorter period, we’d still only receive a ~12.5% return from $250 to match. In this case, the market is pricing in just under 3 years of returns.

Source: Author, Author’s Calculations
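For readers who want to check the arithmetic, here is a short sketch of the implied-return calculation behind both scenarios; the inputs are the article’s assumptions, and small rounding differences are expected.

```python
# Implied-return sketch: what annual return does a buyer at the current price earn
# if they merely reach the same terminal value as the IPO buyer's required return?
import math


def implied_return(ipo_price: float, required_return: float, years: int, current_price: float) -> float:
    terminal_value = ipo_price * (1 + required_return) ** years
    return (terminal_value / current_price) ** (1 / years) - 1


print(f"10-year case: {implied_return(120, 0.20, 10, 250):.1%}")  # ~11.5%
print(f"5-year case:  {implied_return(120, 0.30, 5, 250):.1%}")   # ~12.3%, close to the ~12.5% cited

# Years of 20% compounding already "used up" by the move from $120 to $250:
print(f"Years priced in: {math.log(250 / 120) / math.log(1.20):.1f}")  # ~4.0
```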

What if $250 is a fair price?

If $250 is fair for an investment, then the primary market investors received an absolute bargain. Going back to the assumption that Snowflake turns into an Adobe/Salesforce in 10 years, I’d like to contextualize this price in the more traditional EV/S multiples we’ve come to know. ADBE and CRM trade at roughly 18x and 12x EV/S respectively. We can take the long-shot assumption that SNOW will trade at the high end of that range, 18x EV/S, in 10 years. Note that Adobe is a virtual monopoly with incredible profitability and is still growing at a fair clip, whereas Salesforce is the undisputed leader in customer relationship management. So these are some real, industry-domination assumptions. The reality check now becomes: how much do sales need to grow to justify a 20% rate of return?

  • At $250, EV stands at ~$71.2B at Year 0 (currently); a 20% compounding rate would produce an EV of ~$441B at Year 10 (mid-2030)
  • Applying an 18x EV/S multiple, sales would need to be ~$24.5B in Year 10
  • Current TTM sales are $402.7m; the assumptions therefore require sales to grow ~61x from current levels by Year 10 (a quick sketch of this arithmetic follows below)
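Here is that arithmetic as a quick sketch, using the figures from the bullets above (the 18x exit multiple and 20% required return are the stated assumptions):

```python
# Required-sales sketch for the bullet points above.
current_ev = 71.2e9          # enterprise value at ~$250/share
required_return = 0.20       # assumed annual return
years = 10
exit_ev_to_sales = 18.0      # assumed Year-10 EV/S multiple
current_ttm_sales = 402.7e6

target_ev = current_ev * (1 + required_return) ** years  # ~$441B
required_sales = target_ev / exit_ev_to_sales            # ~$24.5B
growth_multiple = required_sales / current_ttm_sales     # ~61x

print(f"Year-10 EV: ${target_ev / 1e9:.0f}B, required sales: ${required_sales / 1e9:.1f}B, "
      f"multiple of current sales: {growth_multiple:.0f}x")
```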

One might ask, what does a 61x sales growth pattern over 10 years look like? I attempted to plug in some decelerating growth percentages to arrive at the following purely hypothetical chart:

Source: Author, Author’s Calculations
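As a rough sense-check of what such a trajectory entails, here is one illustrative decelerating growth schedule; these rates are my own placeholders, not the chart’s actual inputs, but they compound to roughly the required ~61x.

```python
# One illustrative decelerating growth schedule (placeholder rates, not the chart's inputs)
# that compounds to roughly the required ~61x over 10 years.
yoy_growth = [1.00, 0.90, 0.75, 0.60, 0.50, 0.42, 0.36, 0.30, 0.26, 0.22]

sales_multiple = 1.0
for year, g in enumerate(yoy_growth, start=1):
    sales_multiple *= 1 + g
    print(f"Year {year:2d}: {g:4.0%} growth -> {sales_multiple:5.1f}x cumulative")
# Final line prints ~61.6x, in the neighbourhood of the required ~61x.
```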

If a trajectory like the one charted above comes to pass, we’re looking at an 18x EV/S multiple in 10 years with a 20% annualized return if the stock is bought at the current ~$250. One should note that slight changes in a few of the growth percentages could lead to drastically different results. Something resembling the above growth rates is theoretically possible if the company executes at full speed, sells its products at a near-viral rate, constantly innovates, and becomes as synonymous with data management as Salesforce is with CRM or Adobe is with creative software. Such growth, however, demands flawless execution in my opinion and is difficult to attain even with the natural tailwinds, especially given potential competition. Unfortunately, only a hyper-bullish scenario would yield an acceptable 20% return. At this level, I’d want a larger discount in the price to compensate for the risk.

Takeaways from Valuation

Investors should note that even “inevitability”, or next-big-thing, companies have a time value and a rate of return attached to them. When the market is already pricing in success, especially at this very nascent post-IPO stage, you aren’t being compensated enough for risk, volatility, drawdowns, and the very real possibility that the business falls short. Snowflake appropriately commands a high valuation for its business. I must admit that I too scoffed at the price before diving deeper into the company. Upon further analysis, there seems to be more justification for its valuation than I previously hypothesized. It is stretched, but not absurd. Still, there’s more downside than upside from here in the short term, and the price relative to the long-term prospects isn’t attractive enough to warrant a spot in my portfolio. I’m staying on the sidelines.

Risks

Upside Risks: Snowflake’s revenue growth continues to accelerate (or stays well above 100% yoy); we push forward into a higher valuation-multiple paradigm; the competitive threat is overstated, and rivals remain far from Snowflake’s moat for the near future

Downside Risks: Systemic risks and large drawdowns given the stock’s high-growth nature; competition catches up sooner than expected; macro risks; the lock-up period ends with a large institutional sell-off; IaaS providers raise their prices and pressure gross margins; revenues decelerate faster than expected

Conclusion

I see the current price as a chance to sell Snowflake if you hold it. Even if the company does succeed spectacularly, shareholders are not being compensated enough for the risk they’re taking, in my opinion. It remains an exceptional company with a strong moat and an all-star management team. The market appropriately recognizes that it is best-of-breed, but the valuation has unfortunately gotten ahead of what is reasonable. It’s wise to wait and watch until the lock-up period expires.

Disclosure: I am/we are long MSFT. I wrote this article myself, and it expresses my own opinions. I am not receiving compensation for it (other than from Seeking Alpha). I have no business relationship with any company whose stock is mentioned in this article.