Data by itself is not very useful. Data becomes useful only when it is understood and applied within an application experience.
This desire to put data to work has sparked a boom in cloud-based analytics. Although only a relatively small share of IT spending has moved to the cloud so far (about 6 percent, according to IDC in 2020), the momentum is clearly away from on-premises, legacy business intelligence tools and toward modern, cloud-native options such as Google BigQuery, Amazon Redshift, Databricks, and Snowflake.
The popularity of bringing data and cloud together is evident in Snowflake's rapid rise in the DB-Engines database popularity rankings, from 170th in November 2016 to 11th in January 2023. Part of Snowflake's success certainly comes down to performance and scalability, thanks to its separation of storage and compute, among other benefits.
But an even bigger advantage may simply be the cloud itself. Born in the cloud, Snowflake offers a natural path for organizations looking to move there. It is the same cloud that keeps propelling new databases past legacy alternatives, and the same cloud promises to further turn the world of data upside down in 2023.
All cloud, all the time?
Although I don’t entirely agree with my InfoWorld colleague David Linthicum that “2023 could be the year of public cloud repatriation,” I do agree that we should not fall blindly in love with a technology, treating it as a hammer and every business problem as a nail.
The cloud solves many problems, but not all of them. In areas involving advanced data-driven applications, however, the cloud is indispensable, as Linthicum acknowledges: “When it comes to advanced IT services (AI, deep analytics, massive scaling, quantum computing, etc.), public clouds tend to be more economical.”
Not only more economical, but also more practical.
AWS executive Matt Wood made this case to me years ago, and it’s just as compelling today as it was in 2015.
“Those who go out and buy expensive infrastructure are finding that the problem scope and domain is shifting very quickly,” he said. “By the time they get around to answering the original question, the business has evolved.” When you sink that investment into a data center that’s “frozen in time,” he continued, the questions you can ask of your data are stuck in a time warp.
Even in tough economic times, this focus on the cloud is exactly right, and not merely as a way to trim costs. Elastic infrastructure creates the flexibility to put data to meaningful use: dollars turned into sense, so to speak, rather than dollars and cents. That is the promise of cloud-based analytics tools.
Companies seem to understand that. Snowflake CFO Mike Scarpelli spoke at a recent analyst conference about the competitive dynamics in the data warehousing market: “We never compete with Teradata [an incumbent data analytics company founded in the on-premises software era]. When a customer has made the decision to go off-prem, it is never against Teradata. They have already made the decision to leave.”
If an organization is already turning to the cloud as part of a digital transformation exercise, where does it look? According to Scarpelli: “When we compete for an on-premises migration, it is always [against] Google, Microsoft, [and] AWS, [but AWS] tends to work with us more [out of] the gate.”
In other words, the customer may have spent years with its on-premises data warehouse or BI solution, but it isn’t betting its future there. Its future is in the cloud. If the company is considering a next step, it probably won’t be Oracle, unless it is so entrenched in Oracle that adopting a new system seems too difficult.
Most often, companies are looking for a cloud-based database, data warehouse or lakehouse, or machine learning/artificial intelligence system. More Google BigQuery, in other words, and less SAP BusinessObjects.
Democratizing data
Another reason for the cloud’s success is simplicity, or at least the potential for it. The cloud isn’t inherently more user-friendly, of course, but many cloud systems have embraced a SaaS approach that prioritizes user experience.
Take, for example, this comment from a Reddit thread describing one user’s experience with Snowflake: “If you need a Ph.D. in physics to use your SaaS tool, your tool is useless. SQL users (analysts) love it, the C-suite loves it, the only people it needs to win over are the nerdy engineers like me who had enough hubris to think they could do it all themselves and everyone in the world would learn PySpark one day.”
I recently wrote about data democratization, the push to give more employees access to, and the ability to work with, more and different data. I argued that if companies really want to democratize data, they need to teach their employees how to effectively use cloud-based tools to explore cloud-based data.
Luckily, the cloud also allows machine learning systems to shoulder some of the heavy lifting. As my MongoDB colleague Adam Hughes writes, “The combination of real-time, operational, and embedded analytics—what some call translytics, HTAP, or advanced transactional databases—now enables analytics on application data to drive, inform, and automate decision-making for the app and to provide real-time insights for the user.”
That doesn’t mean machines do our thinking for us. Rather, they remove the undifferentiated heavy lifting of computationally intensive data processing, leaving the more thoughtful work of understanding what that data means for an application, and ultimately for the business, to the user.
None of this is powered entirely by the cloud, but all of it is enhanced and accelerated by the cloud. Data has never been more important, and thanks to cloud computing, accessing and understanding data has never been easier. If you want a near-certain prediction for 2023, it’s that this trend will continue and accelerate.