Data to Value

Data to Value – Part 1.  I spend a great deal of time reviewing and evaluating the ideas, concepts, and tools swirling around data.  The “data concept” space has been exploding, and there are so many new data “this” and data “that” tools that I wanted to bring data professionals and business leaders back to the core concept that matters around the creation, collection, and usage of data: Data to Value.

The main concept is simple: the entire point of collecting and using data is to create business, organizational, or individual value.  All the technical details and jargon that sit between the creation and collection of data and the realization of value are important, but for many users they have become overly complex, especially many of the “latest concepts.”

For a short moment, let’s let go of all the consulting and technical data terms that are becoming overused and often misused: Data Warehouse, Data Lake, Data Mesh, Data Observability, Data THIS and Data THAT.  I’m even seeing data experts and practitioners take different views of the latest concepts depending on where their data education began and which technologies they have used.

Data to Value is what really matters

This article is part of my Frank’s Future of Data series, which I put together to prepare myself for taking advantage of the new paradigms that Snowflake and other “Modern Data Stack” tools/clouds provide.  Before I started my Snowflake journey, I often spoke about the intersection of Data, Automation, and AI/ML.  The intersection of cloud, data, automation, and AI/ML is having massive impacts on our society.

Data to Value Trends

Back in 2018, I had the opportunity to consult on some very advanced and mature data engineering solutions.  A few of them were actively moving toward true “event-driven data processing” with Kafka/Confluent.  It was a massive shift from the traditional batch processing used in 98% of the implementations I had worked on previously.  I thought the concept of non-stop streams of data from different parts of the organization, delivered through Kafka topics, was pretty awesome.  At the same time, these were advanced concepts and paradigm shifts for all but the most advanced data engineering teams at the time.  Here are the Data to Value trends that I think you need to be aware of:

Trend #1 – Non-stop push for faster speed of Data to Value.  In our dominantly capitalist world, faster is better and often provides advantages to organizations, especially around value chains and concepts such as supply chains.  Businesses and organizations continuously look for any advantage they can get.  I kind of hate linking to McKinsey for backup, but here goes: their characteristic #2 for the data-driven enterprise of 2025 is “Data is processed and delivered in real time.”
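To make the batch-versus-streaming contrast concrete, here is a minimal, vendor-neutral Python sketch (the toy data and function names are my own illustration, not any product’s API): with batch processing, value only arrives after the whole batch lands, while event-driven processing surfaces an up-to-date answer the moment each event arrives.

```python
# Toy illustration of Data to Value speed: the same records processed
# as one batch vs. event by event.

def batch_total(records):
    """Wait for the full batch to land, then compute the answer once."""
    return sum(r["amount"] for r in records)

def stream_totals(events):
    """Yield an up-to-date running total after every single event."""
    total = 0
    for event in events:
        total += event["amount"]
        yield total  # value is available immediately, per event

sales = [{"amount": 10}, {"amount": 25}, {"amount": 5}]

print(batch_total(sales))          # one answer at the end: 40
print(list(stream_totals(sales)))  # a running answer each time: [10, 35, 40]
```

The difference matters most when decisions hang on the freshest number: the streaming consumer can act after the first event, while the batch consumer waits for the whole load.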


Trend #2 – Data Sharing.  Coming next week – Part 2.

Trend #3 – Coming next week – Part 2.

Trend #4 – Coming next week – Part 2.

Trend #5 – Fully Automated Data Copying Tools.  The growth of Fivetran and Stitch (now Talend) has been amazing.  We are also seeing huge growth in automated data copy pipelines going the other way, such as Hightouch.  At IT Strategists, we became a partner with Stitch, Fivetran, and Matillion back in 2018.  Coming in 2 weeks – Part 3.

Trend #6 – Coming in 2 weeks – Part 3

Trend #7 – Coming in 2 weeks – Part 3

*What Data to Value trends am I missing?  I put down the top ones I see, but hit me up in the comments or directly if you have additional trends.

Snowflake’s Announcements related to Data to Value

Snowflake is making massive investments and strides to continue to push Data to Value.  Its announcements earlier this year at Snowflake Summit included Data to Value features such as:

*Snowflake’s support of Hybrid Tables and announcement of the Unistore concept – a move into some type of OLTP (Online Transaction Processing).  There is huge customer interest in a concept like this, where a single source of truth becomes possible by having web-based OLTP-style apps operate on Snowflake with Hybrid Tables.

*Snowflake’s Native Apps announcements.  If Snowflake can get this right, it’s a game-changer for Data to Value and for decreasing the cost of deploying Data Applications.

*Streamlit integration into Snowflake.  Again, if Snowflake gets this right, it could be another Data to Value game-changer.

***Also note, the two items above don’t just mean that data “can” go to value faster; they also make the development of data apps, and of combined OLTP/OLAP applications, much less costly and more achievable for “all” types of companies.  They could remove the massive friction that comes with needing a large, high-end, full-stack development effort.  Streamlit really is attempting to remove front-end and middle-tier complexity from developing data applications.  (Aren’t most applications data applications anyway?)  It’s really another low-code data development environment.

*Snowpipe Streaming announcement.  This was super interesting to me since I had worked with Issaic from Snowflake back before the 2019 Summit using the original Kafka-to-Snowflake connector, and I presented on it at Snowflake Summit 2019.  It was awesome to see that Snowflake refactored the old Kafka connector and made it much faster, with an announced 10-times-lower latency.  This is another major win for streaming Data to Value.  (Public Preview later in 2022.)
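For context, the refactored connector’s lower-latency path is selected through its Kafka Connect configuration.  Below is a hedged sketch of a connector properties file based on Snowflake’s documented connector settings; the account URL, key, topic, and object names are placeholders of my own, and exact property support may vary by connector version.

```properties
name=snowflake_sink
connector.class=com.snowflake.kafka.connector.SnowflakeSinkConnector
topics=orders_topic
# Placeholder connection details -- replace with your own account values
snowflake.url.name=myaccount.snowflakecomputing.com:443
snowflake.user.name=kafka_connector_user
snowflake.private.key=<private-key-here>
snowflake.database.name=RAW
snowflake.schema.name=KAFKA
snowflake.role.name=KAFKA_CONNECTOR_ROLE
# SNOWPIPE_STREAMING selects the newer row-based streaming ingestion path,
# as opposed to the original file-based SNOWPIPE method
snowflake.ingestion.method=SNOWPIPE_STREAMING
```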

*Snowpark for Python, and the Snowpark announcements in general.  This is very new tech and the verdict is still out, but it is a major attempt by Snowflake to speed up Data to Value for ML pipelines.  Snowflake is looking to have full data event processing and machine learning run entirely within Snowflake.

Summary

This article is part of my Frank’s Future of Data series, which I put together to prepare myself for taking advantage of the new paradigms that the Snowflake Data Cloud and other “Modern Data Stack” tools/clouds provide.  Before I started my Snowflake journey, I often spoke about the intersection of Data, Automation, and AI/ML.  I truly believe these forces have been changing our world everywhere and will continue to do so for many years.  Data to Value is a key concept that helps me prioritize what provides value from our data-related investments and work.

I hope you found this useful for thinking about your data initiatives.  Focusing specifically on Data to Value can help you prioritize and simplify what is truly most important for your organization.  Good luck!

One Response

  1. Hi Frank – Great post. I’d be interested in hearing your thoughts on customers’ response to Unistore. Do they ‘get it’? And is there a mass rally around Unistore, or are customers still figuring out the maturity of the offering at this point? Thanks!
