
Snowflake

This week I attended the hands-on session ‘Zero to Snowflake’ provided by Snowflake. One of the reasons for me to look into Snowflake is that the Erasmus University of Rotterdam (EUR) is replacing its SAP BI stack with a combination of Matillion, Snowflake and Tableau.

Snowflake is a data platform in the Cloud. In their own words:

“Built from the ground up for the cloud, Snowflake’s unique multi-cluster shared data architecture delivers the performance, scale, elasticity, and concurrency today’s organizations require.

Snowflake is a single, integrated platform delivered as-a-service. It features storage, compute, and global services layers that are physically separated but logically integrated. Data workloads scale independently from one another, making it an ideal platform for data warehousing, data lakes, data engineering, data science, modern data sharing, and developing data applications.”

The introduction I joined was a hands-on session, walking through a PDF with prepared SQL statements. And even though it was scripted, I must say I am impressed by the tool.

Some key findings and learnings I had:

  • I was told the tool is called ‘Snowflake’ because the two French founders loved skiing and it is an English word they can pronounce.
  • The scalability is absolutely impressive; scaling up or down can be done quickly and flexibly (see the sketch after this list).
  • The pricing is very transparent. You pay per second and it works with credits. If you increase the warehouse size by one step, for example from Small to Medium, the speed roughly doubles, and the number of credits per hour doubles as well. If I remember correctly, a credit costs about 5 USD.
  • The cloud region is a choice. So if you need to keep your data within the EU, this can be arranged. The cloud platform (Amazon AWS, Microsoft Azure, Google Cloud) is a choice as well.
  • The cloud software is updated regularly and stays backwards compatible.
  • It had been more than 10 years since I last worked in a SQL environment. I had forgotten how nice it is to just write some code, mark it, and execute it.
  • The data we worked with showed in a great way how you can combine several types of data sources and formats (CSV and JSON).
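
To give an impression of how this looks in practice, below is a minimal Python sketch using the snowflake-connector-python package. All object names (WH_DEMO, DEMO_DB, RAW_JSON, the VARIANT column V) are hypothetical; the ALTER WAREHOUSE statement is the one-liner that does the scaling, and the SELECT shows how a JSON document in a VARIANT column is queried with path notation.

    # Minimal sketch using snowflake-connector-python.
    # All object names (WH_DEMO, DEMO_DB, RAW_JSON, column V) are hypothetical.
    import snowflake.connector

    conn = snowflake.connector.connect(
        user="...", password="...", account="...",   # your account credentials
        warehouse="WH_DEMO", database="DEMO_DB", schema="PUBLIC",
    )
    cur = conn.cursor()

    # Scaling is a single statement: one size step up roughly doubles both
    # the performance and the credits consumed per hour.
    cur.execute("ALTER WAREHOUSE WH_DEMO SET WAREHOUSE_SIZE = 'MEDIUM'")

    # Semi-structured data: a JSON document stored in a VARIANT column can be
    # queried with path notation, next to ordinary CSV-loaded tables.
    cur.execute("""
        SELECT v:customer.name::string AS customer,
               v:order.amount::number  AS amount
        FROM   RAW_JSON
    """)
    for customer, amount in cur.fetchall():
        print(customer, amount)

    cur.close()
    conn.close()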

All in all I got a great first impression of what this tool can do and how to use it.

As for a use case in a SAP BI environment, I can see some possibilities as well. I don’t want to go as far as replacing the complete SAP BI stack. However, sometimes users just want to have the data from the data warehouse, either to make their own reports in non-SAP tools, or to practice data science using Python. A tool like Snowflake might be useful as a layer between SAP BI and these tools. The advantages could be:

  • The data is still centralized, so the ‘single version of the truth’ can be maintained (as opposed to having several datasets going around in the organization)
  • Access to the data set can be granted in an authorized, controlled environment
  • Initial data enrichment can be done flexibly in this environment, without the need to load the data into SAP BI
  • In Snowflake a lot is done with SQL scripting, which is more natural for the data scientists who want to work with the data than training them in SAP BI (a small sketch follows this list)
  • The scalability of the tool is very flexible, and the costs depend on usage rather than the number of users
  • It’s in the cloud, so for IT it is easier to maintain
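
To make that layer idea concrete: a data scientist could pull an authorized slice of the warehouse straight into pandas. A minimal sketch, assuming hypothetical table and column names:

    # Hypothetical example: pull a governed dataset from Snowflake into pandas.
    # Requires: pip install "snowflake-connector-python[pandas]"
    import snowflake.connector

    conn = snowflake.connector.connect(
        user="...", password="...", account="...",
        warehouse="WH_DS", database="EDW", schema="SALES",  # hypothetical names
    )
    df = conn.cursor().execute(
        "SELECT order_date, profit_center, revenue FROM FACT_SALES"
    ).fetch_pandas_all()

    print(df.describe())  # from here on, regular Python data science
    conn.close()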

To conclude: Snowflake is a great tool!

Planning with SAP Analytics Cloud

Recently I completed the openSAP course “Planning with SAP Analytics Cloud”. I can really recommend this course because it gives a full overview of, and hands-on exercises for, the main capabilities of SAC:

  • Analytics
  • Planning
  • Application building

The course focuses mainly on the ‘Planning’ capabilities of SAC, but the other topics are addressed as well, contributing to a full planning cycle and process.

I also used the option of opening a trial account and was able to do the hands-on exercises. I had worked with SAC before, preparing a demo. But now, going through the motions myself, it started to make sense. I feel more comfortable working with SAC now.

One of the most interesting features I found was the use of a predictive scenario in the planning process. In one of the exercises we used the actual data to train a classification prediction and then used the outcome as a planning version in the planning process. I think this can definitely be helpful in a planning and budgeting process.
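
In SAC this is configured through the built-in predictive scenarios, but purely to illustrate the pattern (train on actuals, predict, feed the outcome back as a plan version), here is a conceptual Python/scikit-learn sketch. The file and column names are invented, and this is not how SAC implements it internally:

    # Conceptual sketch of "train on actuals, use the prediction as a plan
    # version". Mimics the idea of an SAC predictive scenario in plain
    # Python/scikit-learn; NOT SAC's internal implementation.
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier

    actuals = pd.read_csv("actuals.csv")           # hypothetical training data
    features = ["region", "segment", "prior_year_spend"]
    X = pd.get_dummies(actuals[features])          # one-hot encode categoricals
    y = actuals["will_exceed_budget"]              # historical label

    model = RandomForestClassifier(random_state=0).fit(X, y)

    plan = pd.read_csv("plan_basis.csv")           # next year's cost centers
    plan["predicted_flag"] = model.predict(
        pd.get_dummies(plan[features]).reindex(columns=X.columns, fill_value=0)
    )
    plan.to_csv("plan_version_predictive.csv", index=False)  # new plan version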

Value-Driver tree

Another interesting feature is the Value-Driver tree:

With the Value-Driver tree template it is very easy to set up a value driver tree to do a high-level, multi-year simulation.
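
To make ‘high-level, multi-year simulation’ concrete: a value-driver tree is essentially a calculation tree in which you tweak a few leaf drivers and the effect rolls up through the tree, year over year. A toy Python sketch of that idea (all drivers and growth rates invented):

    # Toy value-driver tree: revenue = volume * price, profit = revenue - cost.
    # Simulate the effect of the leaf drivers over several years.
    volume, price, cost_ratio = 1000.0, 50.0, 0.8   # invented starting drivers
    volume_growth, price_growth = 0.05, 0.02        # the simulation levers

    for year in range(2020, 2025):
        revenue = volume * price
        profit = revenue * (1 - cost_ratio)
        print(f"{year}: revenue={revenue:,.0f} profit={profit:,.0f}")
        volume *= 1 + volume_growth   # drivers roll forward to the next year
        price *= 1 + price_growth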

Data actions

Data actions are basic functionalities in a planning & budgeting tool, and there were several exercises for those:

  • Copy: for example, copy actuals to plan
  • Allocations: for example, allocate expenses between cost centers based on specific drivers
  • Advanced formulas: custom specific logic

The Copy and Allocations functions are quite easy to implement, following a basic template.
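
To show what such a driver-based allocation does, here is the logic in plain pandas. This is conceptual only; in SAC you configure this in the data action template rather than code it, and all numbers and names are invented:

    # Conceptual allocation: spread central IT expenses over cost centers,
    # proportionally to a driver (headcount). Not SAC syntax, just the logic.
    import pandas as pd

    expense_to_allocate = 90_000  # central IT expenses, invented figure
    drivers = pd.DataFrame({
        "cost_center": ["CC_SALES", "CC_OPS", "CC_HR"],
        "headcount":   [30, 50, 10],
    })
    drivers["share"] = drivers["headcount"] / drivers["headcount"].sum()
    drivers["allocated_expense"] = expense_to_allocate * drivers["share"]
    print(drivers)
    # CC_SALES gets 30/90 -> 30,000; CC_OPS 50,000; CC_HR 10,000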

The advanced formulas are implemented with scripting, so a lot is possible there. The exercises gave a good impression of the possibilities, and the scripting seemed well structured. Of course, with scripting it is possible to make very complex selections, so I cannot yet judge how easy navigation or debugging is with complex formulas.

System features

In the course several other features were shown as well, concerning the process and flow, like data locking, the planning calendar, collaboration, audits and security. It seems that the tool offers everything that is needed.

Impression

All in all I am pretty impressed with the planning functionality and features that are available in SAC. I think for a central planning application it really makes sense to have it in the cloud. Typically the planning and budgeting phase is limited to a specific time period, and everybody starts working on it one day before the deadline, giving a huge peak load of concurrent users for the system. In the cloud the system scales better on those days.

I have worked over the years with several SAP planning, budgeting and consolidation tools, like SAP BPC, SAP BW-IP and SEM-BCS. But given all the features in place, the cloud advantage and the roadmap of SAP, I think this is the first tool to consider in an SAP ecosystem for planning and budgeting.

In conclusion: SAP Analytics Cloud is a great planning tool and my first choice for planning and budgeting in an SAP ecosystem.

Composites in SAP Lumira Designer

The composite functionality in SAP Lumira Designer is great. To illustrate this, I would like to demonstrate the capabilities with an example of a report I built using composites.

In this example the financial department wanted to see an actual/budget comparison for the different organizational units within the company. Based on sample data, the draft in Excel looked like this:

The organizational hierarchy is based on profit centers. Per entity the actual and budget figures are shown. A signal indicator shows whether the difference is within tolerance levels.

Within the reporting team, the guideline is to work according to the IBCS guide book, so these rules were applied. Below are some examples, taken from the website https://www.ibcs.com/.

Replace traffic lights: In this case it was more informative to work with a bar chart instead of traffic lights.

Embed chart elements in tables: The given draft included table information and graphical elements, so these can be nicely combined.

Unify outlier indicators: Since the graph was expected to contain both big and small numbers, it is necessary to handle the outliers.

Also an indicator was added to signal entities that need attention.

So after these discussions on the design, the new draft became:

In the predecessor of Lumira Designer, SAP Design Studio, I would have asked a developer to build an SDK component that would give me this figure. With the composites option and the new feature to iterate over a result set (see this blog), I can do this myself.

The data in this case is provided by a single BW BEx query containing the columns: actual data, budget data, the difference in % and in absolute terms, a graph color indicator, and a blue attention indicator. In the rows there is one characteristic, the profit center hierarchy.

The composite is built with very simple components such as text, panel and icons. Iterating over the rows of the data, a new row composite is created for each row and filled with the relevant parameters for data and formatting.
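
Lumira Designer’s own scripting API is out of scope here, but the pattern itself is easy to show in a conceptual Python mock: iterate over the result set and instantiate one row composite per profit center, passing in the data and formatting parameters. Everything below is a stand-in illustration, not the Lumira API:

    # Conceptual mock of the row-iteration pattern (NOT the Lumira Designer API):
    # for every row in the query result set, create one "row composite" and
    # hand it the values it needs for its texts, bars and indicators.

    def create_row_composite(profit_center, actual, budget, diff_pct,
                             bar_color, needs_attention):
        """Stand-in for instantiating a composite with parameters."""
        flag = " <- attention" if needs_attention else ""
        print(f"{profit_center:<10} act={actual:>7} bud={budget:>7} "
              f"diff={diff_pct:>6.1%} color={bar_color}{flag}")

    result_set = [  # stand-in for the BEx query result, columns as described
        ("PC_NORTH", 1200, 1000,  0.20, "red",   True),
        ("PC_SOUTH",  950, 1000, -0.05, "green", False),
    ]
    for row in result_set:
        create_row_composite(*row)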

The result looks like this:

It is clear that with the new functionalities BI developers have a lot more possibilities to create great visualizations without having to learn SDK development. This makes development and maintenance a lot easier.

So in conclusion: composites are a great new functionality in SAP Lumira Designer.

SAP Lumira Designer

SAP Lumira Designer, the successor of SAP Design Studio, has been on the market and in use for more than two years now. I have been using it for a while as well, and this is the tool I have been waiting for ever since Design Studio came on the market. If you are looking to build a BI application, I don’t think there is a better tool, especially on an SAP BW data warehouse. There is so much possible with just the standard functionality! In this blog I want to focus on the positives and will write about the three things I like best. As with everything, there are also some negatives, but I will save those for another time.

Number 1: Composites

With composites it is possible to make small building blocks, with a certain look and feel and functionality, and then use them as many times as you need to build up a full application. So in short, with composites, everything is possible. I learned a lot from this blog, but there are many more with valuable insights. I have been using composites in the template report for a BI application, which makes the application easier to maintain. But it really comes into the spotlight when you build custom-made visuals based on just the data and programming logic. Also when used for a more operational type of report with a lot of different panels and elements, it is really amazing how much simplicity in coding you can achieve for so many complex operations. I made one of those operational reports, which had a sibling in Design Studio, and the way to build it is so much cleaner, more understandable and more flexible. I really love this functionality.

Number 2: Bookmarking

One of the best-liked features of the SAP BEx and SAP Design Studio tooling has always been the possibility of bookmarking. With bookmarking it is possible to capture the selections and navigation state of a report, save it, and use it again later. For example, for a monthly repeating report, users can just open the bookmark with the settings and selections they liked, and refresh the data. In a way this is a kind of self-service BI, hence its popularity. Bookmarks made by a user can be shared by sending an email with the bookmark link. In this new version, it is possible to build an application that shows all the available bookmarks there are for a user, for a specific report or for certain groups. And it is possible to make ‘public’ bookmarks, which are automatically available for all users of the report they are related to. This makes it a lot easier for users to create and share specific states of a report. For users this functionality is the main reason why they are willing to do a conversion from Design Studio to Lumira Designer.

Number 3: Action sheet

With the action sheet component it is possible to create a nice menu item list that is linked to other components. Its typical use is to have a menu area with action sheets linked to the icons. I think this is a nice component that offers a user-friendly way of showing several menu items.

Honorable mention: Comment function

With the comment functionality it is possible to make context-related notes, either for public or private usage. For example, a user can make a private reminder connected to a number that raises questions, or can make a public note about the same number with an explanation of why the number raises questions. This is a promising functionality, and although I have not implemented it fully yet, users respond well to the demonstrations.

To summarize: Lumira Designer is the best tool for building a scripted BI application on an SAP BW data warehouse.

BI Summit 2019 Dutch Universities

This year the BI Summit of the Dutch universities was held at the TU/e (13 September 2019). This was already the third edition, and it seems that every year it is growing in size and becoming more professional. There were a lot of interesting presentations, and in this blog I would like to highlight three of them:

  • ‘Reinout van Brakel – Sharing insights across Universities’; for showing the ideas and principles behind the sector dashboard
  • ‘Comprehensive forecasting model – BI-Cluster of the TU/e’; for showing the forecast model they built, with different scenario options
  • ‘Piet Daas – Big Data meets official statistics’; for the inspiring demonstrations of the use of ‘big’ data to gain insights

I start with the presentation ‘Reinout van Brakel – Sharing insights across Universities’ of the VSNU (the public affairs agency for Dutch universities). He explained the role of the VSNU in Dutch policy making, and how over the years they have been facilitating the gathering of facts and figures from the universities, in order to be transparent about the achievements of the universities. For individual universities this is also highly valuable information, since they can see (on an aggregated level) information about other higher educational organizations. This allows them to benchmark (e.g. on personnel), view market share (e.g. of a certain teaching program) and see connections (e.g. the origin of first-year master’s students). In 2018 it was decided to build a dashboard on this information (in Tableau), so now the data is easily accessible. Reminder to myself: check their dashboards regularly; they might be inspiring, and the data should be similar.

The presentation of the TU/e, ‘Comprehensive forecasting model – BI-Cluster of the TU/e’, was about a lot more than just the forecasting model. They also explained how their BI Cluster is delivering value to the organization. It is quite a remarkable achievement that within several years they have a professional BI operation running that delivers BI products to the organization. The second part was mainly about the forecasting model, which they built in Power BI. For me it was inspiring to see the principles behind it. Reminder to myself: check if this can add value to my current BI project as well.

The presentation ‘Piet Daas – Big Data meets official statistics’, of the Dutch Central Bureau of Statistics (CBS), was very inspiring in the way it showed how to use big data. The CBS has access (with limitations, of course) to a lot of data, like road sensors, phone location data and ship locations, to name a few. With this data they can gain very interesting insights to improve decision making in the Netherlands. Reminder to myself: keep an eye on interesting reports from the CBS.

To conclude: A very interesting BI Summit this year. Check the VSNU dashboard regularly. Check if there is a requirement for a forecasting model like the TU/e has. Keep an eye out for interesting CBS reports.