A tale of two events: Inside Snowflake’s and Databricks’ marquee events

Reading Time: 2 minutes

The last few weeks have been nothing short of spectacular, with Snowflake and Databricks hosting their marquee events in Las Vegas and San Francisco, respectively. As the epic clash that would shape the future of the data landscape drew near, Fosfor® geared up to put its best foot forward at both these monumental events.

The simultaneous timing of the events raised some eyebrows in the industry, and it soon became evident that a fierce competition was unfolding between the two powerhouses, both vying for a similar target audience and a larger partner ecosystem. While Snowflake’s spectacular show in Las Vegas boasted over 12,000 confirmed attendees, there was equally palpable excitement in the air for the Databricks event happening in San Francisco.

The following is a narration of everything that unfolded over this period for you to experience vicariously through my eyes.

Clash of the titans: Competing for dominance in the data ecosystem

In case you missed it, in the run-up to the events, Snowflake and Databricks had upped the ante by each making an acquisition announcement. Snowflake stole the spotlight with its acquisition of Neeva, a Generative AI-based search tool, while Databricks announced its acquisition of MosaicML.

The keynote speeches were the highlight of each event, providing attendees with a glimpse into the companies’ future directions. The Snowflake keynote featured the NVIDIA founder – albeit a surprising move, it signalled Snowflake’s foray into core large language model infrastructure (dare I say, LLMs are now at play here!). Databricks, on the other hand, delighted attendees by featuring none other than Satya Nadella himself in their keynote, showcasing the strong collaboration between Microsoft Azure and Databricks.



Fosfor also made it into the Snowflake opening keynote in style, being listed as one of the key application providers for their newly launched Snowpark Container Services. The joy and pride that swept through me, I’m sure, resonated with all the other Fosforites and LTIMindtree folks in the hall. A validation as big as this established that we, as a product company, had arrived at a momentous point in our timeline!

The new container services expand the scope of Snowpark to include broader infrastructure options to run more workloads within Snowflake’s Data Cloud. This includes a wide range of AI and Machine Learning models, with additional access to a catalog of third-party software and applications, including LLMs, notebooks, machine learning (ML) operations tools, and more, within their accounts.
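To make the idea concrete, here is a rough sketch of what standing up a containerized workload in Snowflake looks like, based on the SQL surface Snowflake has documented for Container Services. All object names, the instance family, and the image path below are placeholders, and the feature was still in preview at the time of the event, so exact details may differ:

```sql
-- Illustrative sketch only: pool, service, and image names are placeholders.
-- A compute pool provides the infrastructure the containers run on.
CREATE COMPUTE POOL demo_pool
  MIN_NODES = 1
  MAX_NODES = 1
  INSTANCE_FAMILY = CPU_X64_XS;

-- A service runs one or more containers, described by an inline spec,
-- entirely inside the Snowflake Data Cloud.
CREATE SERVICE demo_service
  IN COMPUTE POOL demo_pool
  FROM SPECIFICATION $$
    spec:
      containers:
      - name: app
        image: /demo_db/demo_schema/demo_repo/demo_image:latest
  $$;
```

The point of the model is that the data never leaves Snowflake’s governance boundary: the app, notebook, or LLM runs next to the data rather than the data being exported to the app.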

The great data landscape showdown: the Data Cloud and beyond

The major announcements made by the two behemoths, Snowflake and Databricks, shed light on the new world they are collectively venturing into—the realm of the Data Cloud. It has become clear that their paths are converging, as Snowflake (known for its focus on BI workloads) and Databricks (renowned for excelling in analytics workloads) strive to become the go-to platforms for “All Data – All Workloads.”

Was this perhaps a compelling reason for the simultaneous event scheduling, or was it a mere coincidence?

Here are some deeper observations:

  • Workloads: Both Snowflake and Databricks showcased their commitment to becoming comprehensive data platforms capable of handling all types of workloads. Snowflake introduced the Snowpark Container Services and highlighted their partnership with NVIDIA, emphasizing their focus on analytics workloads. Databricks, on the other hand, emphasized the growing traction of its DB SQL workloads and reiterated the concept of the Data Lakehouse.
  • Generative AI: Both organizations recognized the tremendous potential of Large Language Models (LLMs) and their application in advancing data capabilities. Snowflake’s collaboration with NVIDIA, specifically through Snowpark Container Services, exemplified their focus on harnessing LLMs for search, as well as advanced analytics workloads.

    Databricks, on the other hand, introduced LakehouseIQ and showcased what else is possible in the Generative AI realm with MosaicML, adding an interesting spin to all of this. They strongly believe that training LLMs need not be a costly affair.

    The bigger picture, though, is that just as both Databricks and Snowflake needed an easy way for users to create data applications, they now need an easy way for their users to create Generative AI models, and to host and monetize those applications on their respective marketplaces.

  • Data and app marketplace: In the battlefield of innovation, both organizations acknowledged the importance of providing a robust platform for data and application sharing. Snowflake seems to have an edge here with their industry Data Cloud pitch, and further solidified their position with various showcases around the Data and App Marketplace. Databricks, never to be outdone, responded with their own marketplace, intensifying the race to capture the attention of developers and businesses seeking comprehensive data solutions.
  • Ecosystem play: One of the most intriguing aspects of this rivalry was the battle for ecosystem dominance. Both showcased their robust partnerships with key players in the modern data stack. Snowflake seems to have an edge here as well, with their notable tech partner line-up, which includes DBT Labs, Dataiku, Alation, and Hex. With Snowflake as an investor in all these brands, they had some of the largest booths on the floor! Adding to the intrigue, some of these players – DBT, Alation, and Hex – have Databricks backing them as an investor too. Overall, the product partnership play is a strategic battle of tech fitment and product alignment, fuelled by compelling go-to-market strategies.
  • iPhone vs. Android: The Snowflake CEO mentioned in his keynote that Snowflake is his version of the iPhone for the data world. Now that’s a clear indication of how the Snowflake strategy is set to play out. Streamlit, a major acquisition last year, is getting integrated into the system, providing the capability to build native Snowflake-powered apps. This enables users to enjoy an end-to-end infrastructure-to-app stack. While there is little visibility into how the whole engine works, industry experts believe that if it delivers on its promise, Snowflake might just become a monopoly. Snowflake advertised this by saying that they will take care of the infra, elasticity, security, governance, and everything else that may be a hurdle for businesses in their data journeys.
    In contrast, with Databricks, where everything is open source, you get to see how the data is stored (very Android-like, keeping with the analogy). Databricks even went a step further by removing the barriers between different data formats and released Delta Lake 3.0 and UniForm.
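As a sketch of what that barrier-removal looks like in practice, Delta Lake 3.0’s UniForm lets a single Delta table additionally expose Iceberg-compatible metadata via documented table properties, so Iceberg readers can query it without a copy. Table and column names below are placeholders:

```sql
-- Illustrative sketch only: table and column names are placeholders.
-- With UniForm enabled, the Delta table also generates Iceberg metadata,
-- so one copy of the data serves both ecosystems.
CREATE TABLE sales (id BIGINT, amount DOUBLE)
USING DELTA
TBLPROPERTIES (
  'delta.enableIcebergCompatV1' = 'true',
  'delta.universalFormat.enabledFormats' = 'iceberg'
);
```

The design choice fits the open, “Android-like” posture: rather than converting data between formats, the format boundary itself is dissolved at the metadata layer.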

An unexpected connection

Despite their fierce competition, Snowflake and Databricks share an unexpected connection. Snowflake made working with Iceberg tables easier, and Databricks showcased Snowflake in their Lakehouse Federation announcement slides. So even though the events happened on the same days, they are still friends? 😊

What does all this mean for Fosfor?

The resonance of Fosfor’s Decision Cloud was evident throughout the events, with the Snowflake team presenting joint product launches at the Fosfor booth. The event attendees were the exclusive few to get a sneak peek into the next phase of the Fosfor-Snowflake partnership. This phase of the partnership will see the Fosfor Decision Cloud being integrated even more deeply with the Snowflake Data Cloud to provide groundbreaking capabilities that empower organizations to accelerate their adoption of Artificial Intelligence (AI) and Machine Learning (ML) workloads, and optimize decision intelligence capabilities at enterprise scale.

At the Databricks event in San Francisco, Lumin stole the show with its transformative power of Generative AI, its focus on core industry solutions, and faster business outcomes.

To stay ahead in the rapidly evolving Data Cloud landscape, we are deeply integrated with both these tech giants, fortifying our partner ecosystem. We see a future where these strategic partnerships empower us to become a clear leader in the Decision Cloud space, helping our clients make informed and impactful decisions.


Sandeep Acharya

Head – Product Engineering, Fosfor

A seasoned, hands-on professional with deep experience in the various facets of software development, spanning the “old world” and the “digital age” of the technology world. A high-performing leader with a proven track record of accomplishments in a wide array of engineering solutions in large and small team setups, Sandeep currently leads product engineering for the Fosfor product suite.

Sandeep is an expert in nurturing strategic long-term partnerships and manages effective communication with all levels of decision-makers and executors, focusing on critical success factors and managing customer delight.
