Data to Value Trends, Part 2: Trends #2–4. (Next week we will release the final three trends we are highlighting.)
Welcome to our Snowflake Solutions Community readers who have read Part 1 of this Data to Value 3-part series. For those of you who have not read Part 1 and want to fast-forward: we are making the fundamental point that data professionals and data users of all types need to focus NOT just on creating, collecting, and transforming data. We need to make a conscious effort to focus on, and measure, the TRUE VALUE that each set of data creates. We also need to measure how fast we can get to that value when it provides real business advantages. There is also an argument for treating that value as time-dependent, since data often loses value as it ages.
Here are the trends we are seeing related to improving Data to Value. These are some of my favorites, revolutionizing how rapidly, and with how much QUALITY, data moves to value for HUMANS and their ORGANIZATIONS:
Trend #1 – Data to Value – The non-stop push for speed. (This was covered in the previous article.)
Trend #2 – Data Sharing. More and more Snowflake customers are realizing the massive advantage of data sharing, which lets them share data in place, with no copies, in near real time. Data sharing is a massive competitive advantage if set up and used appropriately. You can securely provide or receive access to data sets and streams from any part of your business or organizational value chain that is also on Snowflake. Because the access is zero-copy, securely governed, and built on Snowflake’s micro-partitions, you get to the data at reduced cost and risk. (A minimal provider-side sketch follows below.)
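To make the mechanics concrete, here is a minimal provider-side sketch in Python using the snowflake-connector-python package and standard Snowflake share DDL. The database, schema, table, and consumer account names (analytics_db, public, orders, partner_account) are hypothetical placeholders.

```python
# Minimal provider-side data sharing sketch. All object and account
# names below are hypothetical placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # your Snowflake account identifier
    user="my_user",
    password="my_password",
)
cur = conn.cursor()

# Create a share and grant access to one table. No data is copied:
# the consumer queries the provider's micro-partitions in place.
cur.execute("CREATE SHARE IF NOT EXISTS sales_share")
cur.execute("GRANT USAGE ON DATABASE analytics_db TO SHARE sales_share")
cur.execute("GRANT USAGE ON SCHEMA analytics_db.public TO SHARE sales_share")
cur.execute("GRANT SELECT ON TABLE analytics_db.public.orders TO SHARE sales_share")

# Make the share visible to the consumer account.
cur.execute("ALTER SHARE sales_share ADD ACCOUNTS = partner_account")
```

The consumer side then runs CREATE DATABASE ... FROM SHARE and queries the shared objects like any other database, seeing the provider’s changes in near real time.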
Trend #3 – Creating Data with the End in Mind. When you logically think through the full creation and consumption life cycle of data, you realize there are real advantages to capturing data in formats that are ready for immediate processing. If you design your data creation and capture, whether logs or other outputs, so that it can be consumed easily and immediately, you gain faster data-to-value cycles and competitive advantages with certain data streams and sets. (A toy sketch of this idea follows below.)
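As a toy illustration, the Python sketch below writes application events as newline-delimited JSON, one self-describing object per line, so a downstream loader (for example, Snowpipe reading from a stage) can ingest it without an intermediate reshaping step. The event types and fields are hypothetical.

```python
# Sketch: create data "with the end in mind" by emitting events as
# NDJSON, a format ready for immediate ingestion and processing.
import json
import time

def emit_event(out, event_type: str, payload: dict) -> None:
    """Write one self-describing event per line (NDJSON)."""
    record = {
        "event_type": event_type,
        "event_ts": time.time(),  # capture the timestamp at creation
        "payload": payload,
    }
    out.write(json.dumps(record) + "\n")

with open("events.ndjson", "a") as f:
    emit_event(f, "order_created", {"order_id": 123, "amount": 42.50})
    emit_event(f, "order_shipped", {"order_id": 123, "carrier": "UPS"})
```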
Trend #4 – Automated Data Applications. I see some really big opportunities with Snowflake’s Native Applications and the Streamlit integration. Bottom line, there is a need for consolidated “best-of-breed” data applications that can hit a low price point because of massive customer volumes. (See the Streamlit sketch below for a feel of how small such an app can be.)
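To give a feel for how little code a data app can take, here is a minimal Streamlit sketch that turns one Snowflake query into a shareable app. The connection parameters and the orders table are hypothetical placeholders.

```python
# Minimal Streamlit data app sketch. Connection parameters and the
# ORDERS table are hypothetical placeholders.
import pandas as pd
import snowflake.connector
import streamlit as st

st.title("Daily Orders")

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="my_password",
)

# Note: Snowflake returns column names in upper case by default.
df = pd.read_sql(
    "SELECT order_date, COUNT(*) AS order_count "
    "FROM orders GROUP BY order_date ORDER BY order_date",
    conn,
)
st.line_chart(df.set_index("ORDER_DATE")["ORDER_COUNT"])
```

Run it with `streamlit run app.py` and those few lines are the entire front end; that is the friction reduction this trend is about.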
Details for these next 3 are coming next week 🙂
Trend #5 – Fully Automated Data Copying Tools. I have watched the growth of Fivetran and Stitch since 2018, and it has been amazing. Now I am seeing the growth of Hightouch and Census as well, which is just as impressive.
Trend #6 – Coming next week
Trend #7 – Coming next week
Snowflake’s Announcements related to Data to Value
These are currently the same as the ones I discussed in last week’s article. I am waiting to hear from my readers: did I miss any other Snowflake Summit announcements that are real Data to Value features?
Snowflake is making massive investments and strides to continue to push Data to Value. Their announcements at Snowflake Summit earlier this year include Data to Value features such as:
*Snowflake’s support for Hybrid Tables and the announcement of the Unistore concept, a move into OLTP (Online Transaction Processing). There is huge interest from customers in this concept, because running web-based OLTP-type apps directly on Snowflake with Hybrid Tables brings them closer to a single source of truth.
*Snowflake’s Native Apps announcements. If Snowflake can get this right, it is a game changer for Data to Value and for lowering the cost of deploying data applications.
*Streamlit integration into Snowflake. Again, if Snowflake gets this right, it could be another Data to Value game changer.
***Also note: the two items above are not only about data getting to value faster; they also make developing data apps, and combined OLTP/OLAP applications, much less costly and more achievable for “all” types of companies. They could remove the massive friction that comes with needing high-end full-stack development. Streamlit is attempting to remove the front-end and middle-tier complexity from developing data applications. (Aren’t most applications really data applications?) It is another low-code data development environment.
*The Snowpipe Streaming announcement. This was super interesting to me, since I worked with Isaac from Snowflake back before the 2019 Summit using the original Kafka-to-Snowflake connector, and I presented on it at Snowflake Summit 2019. It was awesome to see that Snowflake refactored the old Kafka connector and made it much faster. This is another major win for streaming Data to Value, with an announced 10x lower latency. (Public Preview later in 2022.)
*Snowpark for Python, and the Snowpark announcements in general. This is new tech and the verdict is still out, but it is a major attempt by Snowflake to speed up Data to Value for ML pipelines. Snowflake is looking to run the full data event processing and machine learning workflow entirely within Snowflake. (A minimal sketch follows below.)
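As a minimal illustration of what “processing inside Snowflake” looks like, here is a Snowpark for Python sketch. The connection parameters and the sales table are hypothetical placeholders; the DataFrame operations compile to SQL that executes in Snowflake, so the data never leaves the platform.

```python
# Snowpark for Python sketch: transformations compile to SQL and run
# inside Snowflake. Connection details and the table are hypothetical.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import avg, col

connection_parameters = {
    "account": "my_account",
    "user": "my_user",
    "password": "my_password",
    "warehouse": "my_wh",
    "database": "analytics_db",
    "schema": "public",
}
session = Session.builder.configs(connection_parameters).create()

# Lazily build a query plan; nothing executes until an action is called.
features = (
    session.table("sales")
    .filter(col("amount") > 0)
    .group_by("region")
    .agg(avg("amount").alias("avg_amount"))
)
features.show()  # runs in Snowflake and prints a small sample
```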
Summary
This article is part of my Frank’s Future of Data series, which I put together to prepare myself for the new paradigms that the “Snowflake Data Cloud” and other “Modern Data Stack” tools and clouds provide. If you read my initial Data to Value article, then the Snowflake items above are the same as in that first article. Do you have any others that were announced at Snowflake Summit 2022? I hope you found this second Data to Value article useful for thinking about your data initiatives. Again, focusing specifically on Data to Value can help you prioritize and simplify what is truly most important for your organization! Good luck!