
Which is the best BI tool to use, SAC or PowerBI?

This is a question that has been bothering me for the past year. I don’t have a straight answer, but I do have some thoughts on it that I would like to share in this blog.

My answer to the question ‘which is the best BI tool to use, SAC or Power BI?’ would be: ‘Probably Power BI’.

The reasons for this are:

  • It is easier to get started and to get access to the tool: users probably already have a Power BI license through their Office subscription, and they can work with the desktop version on their laptop. With SAC they always need to work in the cloud, and they need to make an effort to obtain a license.
  • There is a huge Power BI community for support and answering questions. The SAP community is smaller and is held back by the poor platform design that was introduced several years ago.
  • There are plenty of features in Power BI to make good-looking reports.

With Power BI, many business users will be able to make great reports that help them in their work.

Of course, for businesses that rely heavily on SAP software, SAC will always be a serious option to consider. Personally, I think SAC will remain a tool used by SAP BI consultants to build high-quality, single-version-of-the-truth reports for the business; reports that together make up a BI application. The reason is that SAC has a pretty steep learning curve, and SAP consultants, who can make full use of its potential, climb it more easily.

SAC will have its unique selling point once its BI application building has reached the same level as SAP Lumira Designer.

SAP is working towards that, but they are not there yet. In the meantime, SAP customers should not rush to replace their current SAP BI solution with SAC; their business users will be more interested in working with Power BI. But in the long run SAC will catch up, and then it will be the main trusted source of BI information.

To conclude: for now it is probably best to use Power BI; in the long run, SAC will be the better choice for SAP-centric businesses.

Budgeting for higher education

Last year I helped Erasmus University Rotterdam (EUR) as a project manager with the implementation of a budgeting tool. In this LinkedIn message (Dutch) I announced the end of my assignment.

The goal of the implementation of a budgeting tool was to make the EUR more agile and effective by having a more streamlined budgeting process.

Using a budgeting software tool supports this goal for three reasons.

1. Better quality of the budget

  • With an integrated database and budgeting rules, there are no errors in the data caused by Excel formulas and manual mistakes.
  • With a centralized, integrated database, the intercompany settlements always balance. Communication about the settlements is also improved and better documented through an approval process.
  • By using a workflow, there is more transparency in the process.

2. More possibilities

  • A tool brings more automation, which makes it easier to produce a budget, so the frequency can increase. Some faculties wanted to make a monthly forecast instead of a quarterly one, giving them better insight into their financial situation. With a tool they can choose to do this.
  • The centralized, integrated database also helps in making better analyses of the budgeting data, and earlier in the process as well. There is no need to ask faculties to send in their data; it is already there! So the central department can help and advise on the best way to budget.
  • With a tool like this it is easier to run several planning scenarios for strategic purposes.

3. User friendliness

  • A big advantage of a tool like this is the centralized and automated retrieval of the financial and personnel data from the administrative source system. Manual downloads and uploads are no longer necessary.
  • Based on the data, the tool can help make predictions about staff within departments. Retirement information, contract end dates, automatic salary increases based on years in a role: all of these help set up a basis for the controllers to plan their staff costs.
  • Better integration of the data within the faculties. Within a faculty there are also internal settlements, and these are automatically in balance.
  • Within the tool automated consolidation on different levels (interfaculty, intercompany, with external parties) speeds up the reporting on the budget.
  • User-friendly functions are available, like copying and distributing data.
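The always-in-balance settlements mentioned above can be illustrated with a minimal sketch. The unit names and amounts below are hypothetical, purely for illustration: because each internal charge is booked once in a shared database, both parties see the same figure, and the eliminations net to zero at group level.

```python
from collections import defaultdict

# Hypothetical internal settlement records: (sender, receiver, amount).
settlements = [
    ("Law", "IT Services", 120_000),
    ("Medicine", "IT Services", 250_000),
    ("IT Services", "Law", 15_000),
]

def consolidated_eliminations(records):
    """Net the internal charges per unit; the grand total must be zero."""
    net = defaultdict(int)
    for sender, receiver, amount in records:
        net[sender] -= amount    # the sender is charged
        net[receiver] += amount  # the receiver books internal revenue
    return dict(net)

net = consolidated_eliminations(settlements)
# Internal flows cancel out at the consolidated level.
assert sum(net.values()) == 0
```

In a spreadsheet-based process, the two sides of such a settlement are typed in twice and can drift apart; in a single database they cannot, which is exactly the benefit described above.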

The result

During the project we started a budgeting cycle and used it to improve the functionality of the tool. Based on that experience, there is some insight into how well the benefits materialized.

The start of using the tool was rough. Due to the pressure of the budgeting cycle timeline, not all functionality had been implemented at the start, only the most essential parts. For example, the internal settlements were in place but not fully developed yet. Some technical issues also occurred in the software supplier’s cloud environment. During the process, and thanks to the dedication of the EUR controllers, all issues were dealt with in a satisfactory manner.

Strong benefits for centralized controlling

After overcoming these difficulties, the benefits started to appear. The intercompany and interfaculty settlement process used to be a source of many errors; not anymore. The integrated, centralized database made it possible to communicate about the same figures when asking for help or clarifying guidelines. The workflow gave insight into the status: who is working on or reviewing a particular budget. All of this improved the process considerably, certainly from a central perspective.

A mixed bag for decentralized controlling

From a more decentralized perspective, that of the services and faculties, it was more of a mixed bag. Some of these users had built Excel files over the years that are systems in themselves, custom made for their needs. The new tool therefore feels like a step backwards when it comes to usability and user friendliness. During the project we spent a lot of time improving the tool to make sure that the benefits would be strong for those users as well. By the time my assignment ended, the tool was used intensively, but at a local level Excel files were still being used for specific topics. Hopefully, with time and some improvements, there will be less need for these.

All in all, based on the reactions of the users, the benefits are recognized and this tool will help EUR in the future.

To conclude: a budgeting tool has a lot of benefits and improves the budgeting process.

Presenting data effectively

The previous week I went to the training ‘Presenting Data Effectively’ with Stephanie Evergreen in Amsterdam. For almost five years I have been subscribed to her newsletter from the website https://stephanieevergreen.com, and I have often been inspired by the letters she writes. When I read in her newsletter that she would be in Amsterdam to give a training, organized by Goof from https://graphichunters.nl/, I definitely wanted to go. I was a bit starstruck, to be honest.

I loved the training.

Stephanie is a really good teacher, with a lot of knowledge of and background in the science of presenting data.

I really enjoyed using the four-step process to make data visualization development a structured activity. Start with ‘What’s the point?’ Move on to ‘Who is the audience and how will this be delivered to them?’ From there to ‘What is the best chart type?’ And finally end with ‘How can you sharpen the point?’

With step 3 came an Excel file with sample data, and we learned how to create some striking graphs using Excel. I found the proportion plot the most impressive.

Another very useful insight concerned step 2, ‘Who is the audience?’ The audience is somewhere on a spectrum. At one extreme, the audience wants to see clear graphs that point straight to the conclusion; if a graph is complicated and they don’t understand it, they won’t trust it. At the other extreme, the audience likes to see the nuance in the graph; if it points too directly to the conclusion, they won’t trust it. So knowing your audience is important when choosing a graph.

In the end, I felt my biggest takeaway from the course was step 4, ‘How can you sharpen the point?’ Too often it is assumed that the data speaks for itself, but that is usually not the case. Tell the viewer what the graph is about. Point at what is interesting. Don’t make the viewer search for meaning in the graph.

Last week I had to give an update on the project I am managing. It was about the questions and comments we logged after going live, and how many of them have been solved. Some were really important, and we tried to solve those as quickly as possible; others were just noted and will be answered later.

Since the story was ‘hey, things changed over time’, I checked which graphs could tell the story of the progress we made. I ended up using the accumulated numbers of total items and solved items. I started with a line graph, but I wanted to put more weight on the solved items, so I turned those into columns. I increased the width of the columns and cleaned up the graph. In the end, it was obvious that there is still a big gap between what we solved and what is still open. So then it was time to ‘sharpen the point’: I stated the point in the title. Of the open items, 11 are issues, of which 7 are high priority. The numbers were formatted in bold to emphasize the 11 and the 7, leaving the message in the mind that 7 is the number to focus on.

This is how the graph turned out.
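The accumulated series behind such a chart can be sketched in a few lines of Python. The weekly counts below are made up for illustration; the point is that plotting running totals, rather than weekly counts, makes the gap between logged and solved items visible at a glance.

```python
from itertools import accumulate

# Hypothetical weekly counts of items logged and items solved after go-live.
new_items    = [30, 25, 10, 5, 3]
solved_items = [5, 12, 15, 10, 8]

# Accumulated totals, as plotted in the chart: total items as a line,
# solved items as (widened) columns.
total_cum  = list(accumulate(new_items))
solved_cum = list(accumulate(solved_items))

# The gap between the two series is the number of items still open.
open_items = total_cum[-1] - solved_cum[-1]
print(total_cum)   # running total of all logged items
print(solved_cum)  # running total of solved items
print(open_items)  # what is still open at the end
```

Feeding these two accumulated series into any charting tool (Excel in my case) reproduces the line-plus-columns setup described above.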

The day after, in the Q&A session, the focus was on the progress made on these 7 items, instead of on all the other open questions still remaining. Which was exactly the message I wanted to get across.

I felt my training day was a day well spent.

To conclude: Stephanie Rocks!

Selecting a budgeting tool

For many organizations, Excel is still the main tool supporting the budgeting and planning process. Excel is very accessible and flexible, so controllers in the departments have a lot of freedom to work with it. But the flexibility has a disadvantage as well: as an Excel file becomes more complex, it is more likely that a mistake slips in. Such errors can accumulate and take a lot of time to find and correct, especially when several departments each submit an Excel file that then needs to be merged into one final overview.

When an organization realizes it spends more time hunting down these Excel errors than doing proper analyses for the budgeting and business decisions it wants to make in the new year, it is time to look for a better solution.

In general the main reasons for choosing a budgeting tool to improve the budgeting process are:

  • Increase the effectiveness and efficiency of the budgeting process
  • Reduce the manual merging work and the error sensitivity of the process
  • Deliver correct and complete management information on time to the management and stakeholders
  • Increase the time available for analyses and better business planning decisions

Last year I helped a Dutch university select a budgeting tool. In this blog I would like to capture some experiences from this process. Over the years I have worked in several roles with SAP planning, budgeting and consolidation tools such as SAP BPC, SAP BW-IP and SEM BCS, and I have more than 10 years of experience working in higher education organizations. In that sense, I knew what to expect in terms of requirements and functionality.

The general software selection process was applied to the project, that is:

  • Gathering requirements, functional and non-functional
  • Setting up a long list and condensing that to a short list
  • Handling supplier questions, proposals and demos
  • Selecting and contracting

Gathering requirements, functional and non-functional

To gather the functional requirements in this project I interviewed 17 controllers and stakeholders. Their input was clustered into 15 main topics:

  • Consolidation and integration
  • Faculties and services planning capabilities
  • Personnel cost & projects planning
  • Plan versions management
  • Collaboration
  • Reporting
  • Workflow, process & audit
  • Flexibility
  • Smart
  • User interface and user friendliness
  • Data integration
  • Managing data
  • General data modelling capabilities
  • Data attribute creation and maintenance
  • Data import, export and transformation (ETL in and out)

Given the state of modern technology, cloud software is preferred as a non-functional requirement. And obviously, security is an important topic, as well as privacy guidelines.

Setting up a long list and condensing that to a short list

To set up the long list, the well-known analyst firms Gartner and BARC were used, together with a survey among other universities. This resulted in a huge list of almost 40 products. After an initial review, the number came down to 27: some products were aimed at a local market, some didn’t have a cloud option. These 27 products were examined more closely, resulting in a shortlist of 8. Having done the openSAP course “Planning with SAP Analytics Cloud” helped get SAP onto this list, but all of these tools looked very promising in terms of functionality.

The next steps in the purchasing process were pretty straightforward and were completed at the end of 2021. A tool has been chosen and is, at the moment of writing (early 2022), being implemented. I am involved in this project as project leader and will share my experiences in a future blog.

To conclude: it’s not necessary to run your budgeting process in Excel; there are some great tools out there!

Snowflake

This week I attended the hands-on session ‘Zero to Snowflake’ provided by Snowflake. One of the reasons for me to look into Snowflake is that Erasmus University Rotterdam (EUR) is replacing its SAP BI stack with a combination of Matillion, Snowflake and Tableau.

Snowflake is a data platform in the cloud. In their own words:

Quote:

Built from the ground up for the cloud, Snowflake’s unique multi-cluster shared data architecture delivers the performance, scale, elasticity, and concurrency today’s organizations require.

Snowflake is a single, integrated platform delivered as-a-service. It features storage, compute, and global services layers that are physically separated but logically integrated. Data workloads scale independently from one another, making it an ideal platform for data warehousing, data lakes, data engineering, data science, modern data sharing, and developing data applications.

Unquote.

The introduction I joined was a hands-on session, walking through a PDF with prepared SQL statements. Even though it was scripted, I must say I am impressed by the tool.

Some key findings/learnings I had:

  • I was told the tool is called ‘Snowflake’ because the two French founders loved skiing, and it is an English word they could pronounce.
  • The scalability is absolutely impressive; scaling can be done quickly and flexibly.
  • The pricing is very transparent. You pay per second, and it works with credits. If you want to increase performance, for example by moving up a warehouse size, the speed doubles, and the number of credits per hour doubles as well. If I remember correctly, a credit costs about 5 USD.
  • The cloud location is a choice, so if you need to keep your data within the EU, this can be arranged. The cloud platform (AWS, Azure, Google Cloud) is a choice as well.
  • The cloud software is updated regularly and is backwards compatible.
  • It had been more than 10 years since I last worked in a SQL environment. I had forgotten how nice it is to just write some code, select it, and execute it.
  • The data we worked with showed in a great way how you can combine several types of data sources and formats (CSV and JSON).

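That pricing model can be illustrated with a small sketch. The credit rate and warehouse numbers below are my own assumptions for illustration, not official Snowflake pricing; the point is that moving up a size doubles the credits burned per hour but also roughly halves the runtime, so the cost of a given job stays about the same.

```python
# Assumed credit price, per the session (~5 USD per credit).
CREDIT_PRICE_USD = 5.0

def job_cost(credits_per_hour, runtime_hours):
    """Cost of one job under a simple pay-per-use credit model."""
    return credits_per_hour * runtime_hours * CREDIT_PRICE_USD

# A job on a small warehouse: assumed 2 credits/hour, running 1 hour.
small = job_cost(credits_per_hour=2, runtime_hours=1.0)

# One size up: credits/hour doubles, but the job finishes in roughly
# half the time, so the total cost stays about the same.
larger = job_cost(credits_per_hour=4, runtime_hours=0.5)

print(small, larger)  # both 10.0 USD
```

The practical takeaway from the session was exactly this: you can often scale up for speed without paying more overall, because you pay per second of compute, not per seat.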
All in all I got a great first impression of what this tool can do and how to use it.

As for a use case in an SAP BI environment, I can see some possibilities as well. I don’t want to go as far as completely replacing the SAP BI stack. However, sometimes users just want to get the data out of the data warehouse, either to make their own reports in non-SAP tools or to practice data science using Python. A tool like Snowflake might be useful as a layer between SAP BI and these tools. The advantages could be:

  • The data is still centralized, so the ‘single version of the truth’ can be maintained (as opposed to having several datasets circulating in the organization)
  • Access to the data set can be granted in an authorized, controlled environment
  • Data enrichment can initially be done flexibly in this environment, without the need to load the data into SAP BI
  • In Snowflake a lot is done with SQL scripting, which is more natural for the data scientists who want to work with the data than training them in SAP BI
  • The scalability of the tool is very flexible, and the costs are based on usage, not on the number of users
  • It’s in the cloud, so it is easier for IT to maintain

To conclude: Snowflake is a great tool!