
Presenting data effectively

The previous week I attended the training ‘Presenting Data Effectively’ with Stephanie Evergreen in Amsterdam. For almost five years I have been subscribed to her newsletter at https://stephanieevergreen.com, and her letters have often inspired me. When I read in her newsletter that she would be giving a training in Amsterdam, organized by Goof from https://graphichunters.nl/, I definitely wanted to go. I was a bit starstruck, to be honest.

I loved the training.

Stephanie is a really good teacher, with a lot of knowledge of and background in the science of presenting data.

I really enjoyed using her four-step process, which turns data visualization development into a structured activity. Start with ‘What’s the point?’. Move on to ‘Who is the audience and how will this be delivered to them?’. From there, ‘What is the best chart type?’. And finally, end with ‘How can you sharpen the point?’.

Step 3 came with an Excel workbook of sample data, and we learned how to create some impressive graphs in Excel. I found the proportion plot the most impressive.

Another very useful insight concerned step 2, ‘Who is the audience?’. Every audience sits somewhere on a spectrum. At one extreme, the audience wants clear graphs that point straight to the conclusion; if a graph is complicated and they don’t understand it, they won’t trust it. At the other extreme, the audience likes to see the nuance in the graph; if it points too directly to the conclusion, they won’t trust it. So knowing your audience is important when choosing a graph.

In the end, I felt my biggest takeaway from the course was step 4, ‘How can you sharpen the point?’. Too often it is assumed that the data speaks for itself, but that is usually not the case. Say what the graph is about. Point at what is interesting. Don’t make the viewer search for meaning in the graph.

Last week I had to give an update on a project I manage: the items we took in after going live, and how many of them have been solved. All questions and comments were noted down after the project went live. Some were really important, and we tried to solve those as quickly as possible; others were just noted and will be answered later.

Since the story was ‘Hey, things changed over time’, I checked which graph types could tell the story of the progress we made. I ended up using the accumulated numbers of total items and solved items. I started with a line graph, but I wanted to put more weight on the solved items, so I turned those into columns. I increased the width of the columns and cleaned up the graph. In the end, it was obvious that there is still a big gap between what we solved and what is still open. So then it was time to ‘sharpen the point’: I stated the point in the title. Of the open items, 11 are issues, of which 7 are high-priority issues. The numbers were formatted in bold to emphasize the 11 and the 7, leaving the message in mind that 7 is the number to focus on.

This is how the graph turned out.
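
For anyone who wants to reproduce this kind of chart outside Excel, here is a minimal matplotlib sketch of the same idea. The numbers and labels are illustrative, not the project’s real figures:

  import matplotlib.pyplot as plt

  weeks = list(range(1, 9))
  total_items = [12, 25, 33, 40, 44, 47, 49, 50]   # cumulative items taken in (illustrative)
  solved_items = [2, 8, 15, 20, 24, 27, 30, 32]    # cumulative items solved (illustrative)

  fig, ax = plt.subplots(figsize=(8, 4))
  # Wide columns put visual weight on the solved items; a line shows the total.
  ax.bar(weeks, solved_items, width=0.7, color="#4a7ebb", label="Solved")
  ax.plot(weeks, total_items, color="#999999", linewidth=2, label="Total items")

  # Sharpen the point: state it in the title instead of leaving the viewer searching.
  ax.set_title("Of the open items, 11 are issues, 7 of them high priority",
               loc="left", fontweight="bold")
  ax.set_xlabel("Week after go-live")
  ax.legend(frameon=False)
  # Clean up the graph: drop the top and right spines.
  ax.spines["top"].set_visible(False)
  ax.spines["right"].set_visible(False)
  plt.tight_layout()
  plt.show()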

The day after, in the Q&A session, the focus was on the progress made on these 7 items, instead of on all the other open questions still remaining. Which was exactly the message I wanted to get across.

I felt my training day was a day well spent.

To conclude: Stephanie Rocks!

Selecting a budgeting tool

For many organizations, Excel is still the main tool for supporting the budgeting and planning process. Excel is very accessible and flexible, so the controllers in the departments have a lot of freedom to work with. That flexibility has a disadvantage as well: as an Excel file becomes more complex, it becomes more likely that a mistake slips in. Such errors can accumulate and take a lot of time to find and correct, especially when several departments each submit an Excel file that then needs to be merged into one final overview.
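
As a small illustration of that merging step, here is a minimal pandas sketch that combines per-department budget files into one overview. The file names and column names are assumptions for the example:

  from pathlib import Path
  import pandas as pd

  # Each department submits its own workbook, e.g. budget_finance.xlsx,
  # budget_hr.xlsx, ... all with the same column layout (assumed here).
  frames = []
  for path in Path("submissions").glob("budget_*.xlsx"):
      df = pd.read_excel(path)
      df["department"] = path.stem.removeprefix("budget_")
      frames.append(df)

  combined = pd.concat(frames, ignore_index=True)
  # One final overview: budget per cost category across departments.
  overview = combined.pivot_table(index="cost_category", columns="department",
                                  values="amount", aggfunc="sum")
  print(overview)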

When an organization realizes it spends more time hunting down these errors in Excel than on proper analysis of the budget and the business decisions it wants to make in the new year, it is time to look for a better solution.

In general, the main reasons for choosing a budgeting tool to improve the budgeting process are:

  • Increase the effectiveness and efficiency of the budgeting process
  • Reduce the manual merging work and the error sensitivity of the process
  • Deliver correct and complete management information to management and stakeholders on time
  • Free up time for analysis and better business planning decisions

Last year I helped a Dutch university select a budgeting tool. In this blog I would like to capture some experiences from that process. Over the years I have worked in several roles with SAP planning, budgeting and consolidation tools such as SAP BPC, SAP BW-IP and SEM BCS, and I have more than 10 years of experience working in higher education organizations. In that sense, I know what to expect in terms of requirements and functionality.

The general software selection process was applied to the project, that is:

  • Gathering requirements, functional and non-functional
  • Setting up a long list and condensing that to a short list
  • Getting suppliers’ answers to questions, proposals and demos
  • Selecting and contracting

Gathering requirements, functional and non-functional

To gather the functional requirements in this project, I interviewed 17 controllers and stakeholders. Their input was clustered into 15 main topics:

  • Consolidation and integration
  • Faculties and services planning capabilities
  • Personnel cost & projects planning
  • Plan versions management
  • Collaboration
  • Reporting
  • Workflow, process & audit
  • Flexibility
  • Smart
  • User interface and user friendliness
  • Data integration
  • Managing data
  • General data modelling capabilities
  • Data attribute creation and maintenance
  • Data import, export and transformation (ETL in and out)

Given the state of modern technology, cloud software is preferred as a non-functional requirement. And obviously, security is an important topic, as are privacy guidelines.

Setting up a long list and condensing that to a short list

To set up the long list, reports from the well-known research firms Gartner and BARC were used, together with a survey among other universities. This resulted in a huge list of almost 40 products. After an initial review, the number came down to 27: some products were aimed at a local market, and some didn’t have a cloud option. These 27 products were examined more closely, resulting in a shortlist of 8. Having done the openSAP course “Planning with SAP Analytics Cloud” helped get SAP onto this list. But all of these tools looked very promising in terms of functionality.

The next steps in the purchasing process were pretty straightforward and were completed at the end of 2021. A tool has been chosen and is, at the moment of writing (early 2022), being implemented. I am involved in that project as project leader, and I will share my experiences in a future blog.

To conclude: it’s not necessary to run your budgeting process in Excel; there are some great tools out there!

Snowflake

This week I attended the hands-on session ‘Zero to Snowflake’, provided by Snowflake. One of the reasons for me to look into Snowflake is that the Erasmus University Rotterdam (EUR) is replacing its SAP BI stack with a combination of Matillion, Snowflake and Tableau.

Snowflake is a data platform in the cloud. In their own words:

Quote:

Built from the ground up for the cloud, Snowflake’s unique multi-cluster shared data architecture delivers the performance, scale, elasticity, and concurrency today’s organizations require.

Snowflake is a single, integrated platform delivered as-a-service. It features storage, compute, and global services layers that are physically separated but logically integrated. Data workloads scale independently from one another, making it an ideal platform for data warehousing, data lakes, data engineering, data science, modern data sharing, and developing data applications.

Unquote.

The introduction I joined was a hands-on session, walking through a PDF with prepared SQL statements. And even though it was scripted, I must say I am impressed by the tool.

Some key findings/learnings I had:

  • I was told the tool is called ‘Snowflake’ because the two French founders loved skiing and it is an English word they could pronounce.
  • The scalability is absolutely impressive; a warehouse can be resized quickly and flexibly (see the sketch after this list).
  • The pricing is very transparent: you pay per second, and it works with credits. If you want to increase performance, each step up in warehouse size (Small to Medium to Large) roughly doubles the speed, and also doubles the number of credits per hour. If I remember correctly, a credit is about 5 USD.
  • The cloud region is a choice, so if you need to keep your data within the EU, that can be arranged. The underlying cloud platform (Amazon, Azure, Google) is a choice as well.
  • The cloud software is updated regularly and remains backwards compatible.
  • It had been more than 10 years since I last worked in a SQL environment. I had forgotten how nice it is to just write some code, select it, and execute it.
  • The data we worked with showed nicely how you can combine several types of data sources and formats (CSV and JSON).
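
As a small illustration of these last points, here is a minimal Python sketch using the snowflake-connector-python package. The account, warehouse and table names are assumptions for the example, not the ones from the session:

  import snowflake.connector  # pip install snowflake-connector-python

  # Connection details are assumptions for the example.
  conn = snowflake.connector.connect(
      account="my_account", user="my_user", password="...",
      warehouse="DEMO_WH", database="DEMO_DB", schema="PUBLIC",
  )
  cur = conn.cursor()

  # Scaling is a one-liner: resizing the virtual warehouse takes seconds.
  cur.execute("ALTER WAREHOUSE DEMO_WH SET WAREHOUSE_SIZE = 'LARGE'")

  # Semi-structured JSON lands in a VARIANT column and can be queried
  # alongside ordinary (CSV-loaded) relational data.
  cur.execute("""
      SELECT t.city, w.payload:temperature::float AS temp
      FROM trips t
      JOIN weather_json w ON t.city = w.payload:city::string
  """)
  for row in cur.fetchall():
      print(row)

  cur.close()
  conn.close()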

All in all, I got a great first impression of what this tool can do and how to use it.

As for a use case in an SAP BI environment, I can see some possibilities as well. I wouldn’t go as far as replacing the complete SAP BI stack. However, sometimes users just want the data from the data warehouse, either to make their own reports in non-SAP tools or to practice data science using Python. A tool like Snowflake might be useful as a layer between SAP BI and these tools. The advantages could be:

  • The data stays centralized, so the ‘single version of the truth’ can be maintained (as opposed to having several datasets going around in the organization)
  • Access to the data set can be granted in an authorized, controlled environment
  • Initial data enrichment can be done flexibly in this environment, without the need to load the data into SAP BI
  • In Snowflake a lot is done with SQL scripting, which is more natural for the data scientists who want to work with the data than training them in SAP BI (a small sketch follows below this list)
  • The scalability of the tool is very flexible, and the costs are based on usage rather than the number of users
  • It’s in the cloud, so it is easier for IT to maintain
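
For the data-science case, pulling a governed dataset straight into pandas is only a few lines; a minimal sketch, again with assumed connection and table names:

  import snowflake.connector  # pip install "snowflake-connector-python[pandas]"

  conn = snowflake.connector.connect(account="my_account", user="my_user", password="...")
  cur = conn.cursor()
  cur.execute("SELECT * FROM DEMO_DB.PUBLIC.trips WHERE start_date >= '2022-01-01'")
  df = cur.fetch_pandas_all()  # result set straight into a pandas DataFrame
  print(df.describe())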

To conclude: Snowflake is a great tool!

Planning with SAP Analytics Cloud

Recently I completed the openSAP course “Planning with SAP Analytics Cloud”. I can really recommend this course, because it gives a full overview of, and hands-on exercises for, the main capabilities of SAC:

  • Analytics
  • Planning
  • Application building

The course focuses mainly on the ‘Planning’ capabilities of SAC, but the other topics are addressed as well, contributing to a complete planning cycle and process.

I also used the option of opening a trial account and was able to do the hands-on exercises. I had worked with SAC before, preparing a demo, but now, going through the motions, it started to make sense. I feel more comfortable working with SAC now.

One of the most interesting features I found was the use of the predictive scenario in the planning process. In one of the exercises we used actual data to train a classification prediction and then used the outcome as a planning version in the planning process. I think this can definitely be helpful in a planning and budgeting process.
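
In SAC this is all configured in the predictive-scenario UI rather than coded, but conceptually it resembles the following scikit-learn sketch; the column names and data are illustrative assumptions:

  import pandas as pd
  from sklearn.ensemble import RandomForestClassifier

  # Illustrative actuals: which cost centers overran their budget last year.
  actuals = pd.DataFrame({
      "headcount":   [12, 45, 8, 30, 22, 55],
      "project_cnt": [3, 10, 1, 6, 4, 12],
      "overran":     [0, 1, 0, 1, 0, 1],   # target learned from actual data
  })

  model = RandomForestClassifier(random_state=0)
  model.fit(actuals[["headcount", "project_cnt"]], actuals["overran"])

  # Score next year's plan and use the predictions as input for a plan version.
  plan = pd.DataFrame({"headcount": [15, 40], "project_cnt": [2, 9]})
  plan["predicted_overrun"] = model.predict(plan[["headcount", "project_cnt"]])
  print(plan)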

Value-Driver tree

Another interesting feature is the Value-Driver tree:

With the value driver tree template it is very easy to set up a value driver tree for a high-level, multi-year simulation.
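
Outside SAC, the idea of a value driver tree boils down to cascading a few drivers through simple formulas over multiple years. A minimal Python sketch with made-up drivers:

  # Illustrative driver tree: revenue = students * fee, simulated over 5 years.
  students, fee, cost_ratio = 10_000, 2_200.0, 0.85
  student_growth, fee_growth = 0.03, 0.02   # the drivers you play with

  for year in range(2022, 2027):
      revenue = students * fee
      margin = revenue * (1 - cost_ratio)
      print(f"{year}: revenue {revenue:>14,.0f}  margin {margin:>12,.0f}")
      students *= 1 + student_growth        # cascade the drivers to the next year
      fee *= 1 + fee_growth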

Data actions

Data actions are basic functionality in a planning & budgeting tool, and there were several exercises for them:

  • Copy: for example, copy actuals to plan
  • Allocations: for example, allocate expenses between cost centers based on specific drivers
  • Advanced formulas: custom logic for specific requirements

The Copy and Allocations functions are quite easy to implement according to a basic template.
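
In SAC these are configured steps rather than code, but as a rough analogue, here is a minimal pandas sketch of what a driver-based allocation does; the accounts and drivers are made up:

  import pandas as pd

  # Central IT expense to allocate, and a driver (headcount) per cost center.
  it_expense = 120_000.0
  drivers = pd.Series({"CC_SALES": 30, "CC_HR": 10, "CC_OPS": 60}, name="headcount")

  # Allocate proportionally to the driver, like an allocation step would.
  allocated = it_expense * drivers / drivers.sum()
  print(allocated)  # CC_SALES 36000, CC_HR 12000, CC_OPS 72000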

The advanced formulas are implemented with scripting, so a lot of things are possible there. The exercises gave a good impression of the possibilities, and the scripting seemed well structured. Of course, with scripting it is possible to make very complex selections, so I cannot yet judge how easy navigation and debugging are with complex formulas.

System features

In the course, several other features concerning the process and flow were shown as well, like data locking, the planning calendar, collaboration, audits and security. It seems that the tool offers everything that is needed.

Impression

All in all, I am pretty impressed with the planning functionality and features available in SAC. I think for a central planning application it really makes sense to have it in the cloud. Typically, the planning and budgeting phase is limited to a specific time period, and everybody starts working on it one day before the deadline, giving a huge peak load of concurrent users on the system. In the cloud, it scales better on those days.

Over the years I have worked with several SAP planning, budgeting and consolidation tools, like SAP BPC, SAP BW-IP and SEM BCS. But given all the features in place, the cloud advantage and SAP’s roadmap, I think SAC is now the first tool to consider for planning and budgeting in an SAP ecosystem.

In conclusion: SAP Analytics Cloud is a great planning tool and my first choice for planning and budgeting in an SAP ecosystem.

Composites in SAP Lumira Designer

The composite functionality in SAP Lumira Designer is great. To illustrate this, I would like to demonstrate its capabilities with an example of a report I built using composites.

In this example, the financial department wanted to see an actual/budget comparison for the different organizational units within the company. Based on sample data, the draft in Excel looked like this.

The organizational hierarchy is based on profit centers. Per entity, the actual and budget figures are shown. A signal indicator shows whether the difference is within tolerance levels.

Within the reporting team, the guideline is to work with the IBCS guide book, so these rules were applied. Below are some examples, taken from the website https://www.ibcs.com/.

Replace traffic lights: in this case it was more informative to work with a bar chart instead of traffic lights.

Embed chart elements in tables: the given draft included table information and graphical elements, so these could be nicely combined.

Unify outlier indicators: since the graph was expected to contain both big and small numbers, it was necessary to handle the outliers.

An indicator was also added to signal entities that need attention.

So after these discussions on the design, the new draft became:

In Lumira Designer’s predecessor, SAP Design Studio, I would have asked a developer to build an SDK component to produce this figure. With the composites option and the new feature to iterate over a result set (see this blog), I can do this myself.

The data in this case is provided by a single BW BEx query with the following columns: actual data, budget data, differences in % and absolute terms, a graph color indicator, and a blue attention indicator. The rows contain one characteristic: the profit center hierarchy.

The composite is built with very simple components, such as text, panel and icons. While iterating over the rows of the data, a new row composite is created for each row and filled with the relevant parameters for data and formatting.
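
Lumira Designer uses its own scripting language for this, but the pattern itself is simple. A conceptual Python sketch, where the result-set shape and the row-composite structure are hypothetical stand-ins for illustration:

  # Hypothetical stand-ins for the BEx result set and the row composite.
  result_set = [
      {"profit_center": "PC-1000", "actual": 120, "budget": 100, "attention": True},
      {"profit_center": "PC-2000", "actual": 95,  "budget": 100, "attention": False},
  ]

  rows = []
  for record in result_set:
      diff_pct = (record["actual"] - record["budget"]) / record["budget"] * 100
      # One "row composite" per data row, parameterized with data and formatting.
      rows.append({
          "label": record["profit_center"],
          "bar_value": diff_pct,
          "bar_color": "red" if abs(diff_pct) > 10 else "grey",  # assumed tolerance
          "show_attention_icon": record["attention"],
      })

  for row in rows:
      print(row)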

The result looks like this:

It is clear that with these new functionalities, BI developers have many more possibilities to create great visualizations without having to learn SDK skills. This makes development and maintenance a lot easier.

So in conclusion: composites are a great new functionality in SAP Lumira Designer!