
Selecting a budgeting tool

For many organizations, Excel is still the main tool supporting the budgeting and planning process. Excel is accessible and flexible, giving the controllers in the departments a lot of freedom to work with. That flexibility has a disadvantage as well: as the Excel file becomes more complex, it is more likely that a mistake slips in. Such errors can accumulate and take a lot of time to find and correct, especially when several departments each submit an Excel file that then needs to be merged into one final overview.

When an organization realizes it spends more time hunting down these errors in Excel than doing proper analysis of the budgeting and business decisions it wants to make in the new year, it is time to look for a better solution.

In general, the main reasons for choosing a budgeting tool to improve the budgeting process are:

  • Increase the effectiveness and efficiency of the budgeting process
  • Reduce the manual merging work and the error sensitivity of the process
  • Deliver correct and complete management information to management and stakeholders on time
  • Free up time for analyses and better business planning decisions

Last year I helped a Dutch university to select a budgeting tool. In this blog I would like to capture some experiences from that process. Over the years I have worked in several roles with SAP planning, budgeting and consolidation tools such as SAP BPC, SAP BW-IP and SEM BCS, and I have more than 10 years of experience working in higher education organizations. In that sense I knew what to expect in terms of requirements and functionality.

A standard software selection process was applied to the project:

  • Gathering requirements, functional and non-functional
  • Setting up a long list and condensing that to a short list
  • Receiving suppliers' questions, proposals and demos
  • Selecting and contracting

Gathering requirements, functional and non-functional

To gather the functional requirements in this project I interviewed 17 controllers and stakeholders. Their input was clustered into 15 main topics:

  • Consolidation and integration
  • Faculties and services planning capabilities
  • Personnel cost & projects planning
  • Plan versions management
  • Collaboration
  • Reporting
  • Workflow, process & audit
  • Flexibility
  • Smart
  • User interface and user friendliness
  • Data integration
  • Managing data
  • General data modelling capabilities
  • Data attribute creation and maintenance
  • Data import, export and transformation (ETL in and out)

Given the state of modern technology, cloud software was a preferred non-functional requirement. Security is obviously an important topic as well, as are privacy guidelines.

Setting up a long list and condensing that to a short list

To set up the long list, reports from the well-known research firms Gartner and BARC were used, together with a survey among other universities. This resulted in a huge list of almost 40 products. After an initial review the number came down to 27: some products were aimed at a local market, and some didn't have a cloud option. These 27 products were examined more closely, resulting in a shortlist of 8. Having done the openSAP course "Planning with SAP Analytics Cloud" helped me get SAP onto this list, but all of these tools looked very promising in terms of functionality.

The next steps in the purchasing process were fairly straightforward and were completed at the end of 2021. A tool has been chosen and is, at the moment of writing (early 2022), being implemented. I am involved in this project as project leader and will share my experiences in a future blog.

To conclude: it's not necessary to run your budgeting process in Excel; there are some great tools out there!

Snowflake

This week I attended the hands-on session 'Zero to Snowflake', provided by Snowflake. One of my reasons for looking into Snowflake is that the Erasmus University Rotterdam (EUR) is replacing its SAP BI stack with a combination of Matillion, Snowflake and Tableau.

Snowflake is a data platform in the Cloud. In their own words:

Quote:

Built from the ground up for the cloud, Snowflake’s unique multi-cluster shared data architecture delivers the performance, scale, elasticity, and concurrency today’s organizations require.

Snowflake is a single, integrated platform delivered as-a-service. It features storage, compute, and global services layers that are physically separated but logically integrated. Data workloads scale independently from one another, making it an ideal platform for data warehousing, data lakes, data engineering, data science, modern data sharing, and developing data applications.

Unquote.

The introduction I joined was a hands-on session, walking through a PDF of prepared SQL statements. Even though it was scripted, I must say I am impressed by the tool.

Some key findings/learnings I had:

  • I was told the tool is called 'Snowflake' because the two French founders loved skiing, and it is an English word they could pronounce.
  • The scalability is absolutely impressive; scaling can be done quickly and flexibly.
  • The pricing is very transparent. You pay per second and it works with credits. If you increase performance, for example from Small to Large, the speed doubles and the number of credits per hour doubles as well. If I remember correctly, a credit costs about 5 USD.
  • The cloud location is a matter of choice, so if you need to keep your data within the EU, that can be arranged. The underlying cloud platform (Amazon, Azure, Google) is a choice as well.
  • The cloud software is updated regularly and is backwards compatible.
  • It had been more than 10 years since I last worked in an SQL environment. I had forgotten how nice it is to just write some code, mark it, and execute it.
  • The data we worked with showed nicely how you can combine several types of data sources and formats (CSV and JSON).
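The pricing point above can be made concrete with a toy calculation. Note the specific numbers are assumptions for illustration only (as remembered from the session), not official Snowflake rates:

```python
# Toy cost model for per-second, credit-based pricing. The credit
# price and credits-per-hour figures are illustrative assumptions.
CREDIT_PRICE_USD = 5.0  # "a credit costs about 5 USD" (as remembered)

# Assumed: each warehouse size step doubles both speed and credits/hour.
CREDITS_PER_HOUR = {"XS": 1, "S": 2, "M": 4, "L": 8}

def query_cost(size: str, seconds: float) -> float:
    """Cost of running a warehouse of the given size for some seconds."""
    credits = CREDITS_PER_HOUR[size] * seconds / 3600
    return credits * CREDIT_PRICE_USD

# One size up doubles the hourly rate, but if the query then runs
# twice as fast, the total cost of the query stays the same:
on_small = query_cost("S", 600)   # 10 minutes on Small
on_medium = query_cost("M", 300)  # assumed twice as fast on Medium
```

This is why the per-second model feels so transparent: scaling up mainly trades elapsed time for hourly rate, not total cost.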

All in all I got a great first impression of what this tool can do and how to use it.
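The point about combining CSV and JSON sources can be mimicked in plain Python: one source arrives as CSV, another as JSON, and they are joined on a shared key. The column names and data here are invented for the sketch:

```python
import csv
import io
import json

# A CSV reference source and a JSON fact source, both invented.
csv_data = "region_id,region_name\n1,EMEA\n2,AMER\n"
json_data = '[{"region_id": 1, "sales": 100}, {"region_id": 2, "sales": 250}]'

# Build a lookup from the CSV rows.
regions = {int(r["region_id"]): r["region_name"]
           for r in csv.DictReader(io.StringIO(csv_data))}

# Join the JSON rows to the CSV lookup on region_id.
combined = [{"region": regions[row["region_id"]], "sales": row["sales"]}
            for row in json.loads(json_data)]
```

In Snowflake itself this kind of join is done in SQL over staged files; the sketch only shows the shape of the operation.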

As for a use case in an SAP BI environment, I can see some possibilities as well. I wouldn't go as far as replacing the complete SAP BI stack. However, sometimes users just want the data from the data warehouse, either to make their own reports in non-SAP tools or to practice data science using Python. A tool like Snowflake might be useful as a layer between SAP BI and these tools. The advantages could be:

  • The data is still centralized, so the 'single version of the truth' can be maintained (as opposed to having several datasets floating around the organization)
  • Access to the data set takes place in an authorized, controlled environment
  • Initial data enrichment can be done flexibly in this environment, without the need to load the data into SAP BI
  • In Snowflake a lot is done with SQL scripting, which is more natural for the data scientists who want to work with the data than training them in SAP BI
  • The scalability of the tool is very flexible, and the costs depend on usage rather than the number of users
  • It's in the cloud, so it is easier for IT to maintain

To conclude: Snowflake is a great tool!

Planning with SAP Analytics Cloud

Recently I completed the openSAP course "Planning with SAP Analytics Cloud". I can really recommend this course because it gives a full overview and hands-on exercises for the main capabilities of SAC:

  • Analytics
  • Planning
  • Application building

The course focuses mainly on the 'Planning' capabilities of SAC, but the other topics are addressed as well, contributing to a full planning cycle and process.

I also used the option of opening a trial account and was able to do the hands-on exercises. I had worked with SAC before, preparing a demo, but now, going through the exercises step by step, it started to make sense. I feel more comfortable working with SAC now.

One of the most interesting features I found was the use of the predictive scenario in the planning process. In one of the exercises we used the actual data to train a classification prediction and then used the outcome as a planning version in the planning process. I think this can definitely be helpful in a planning and budgeting process.
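The idea behind that exercise, learn from actuals and let the prediction seed a plan version, can be sketched in a few lines. The "model" below is a deliberately trivial average-growth extrapolation standing in for SAC's predictive engine, and all figures and version names are invented:

```python
# Invented actuals per year.
actuals = {2019: 100.0, 2020: 110.0, 2021: 121.0}

# A trivial stand-in model: average year-over-year growth.
years = sorted(actuals)
growth_rates = [actuals[b] / actuals[a] for a, b in zip(years, years[1:])]
avg_growth = sum(growth_rates) / len(growth_rates)

# The prediction becomes the starting point of a new plan version,
# which planners can then adjust manually.
plan_version = {"version": "PLAN_PREDICTED",
                2022: round(actuals[years[-1]] * avg_growth, 1)}
```

The value is in the workflow, not the model: planners start from a data-driven baseline instead of a blank sheet.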

Value-Driver tree

Another interesting feature is the Value-Driver tree:

With the Value-Driver tree template it is very easy to set up a value driver tree for a high-level, multi-year simulation.
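A value-driver tree in miniature: leaf drivers roll up to a top KPI, and changing one driver simulates the multi-year effect. The drivers and numbers below are invented for the sketch (a higher-education flavored example, not from the course):

```python
# Top KPI as a function of its leaf drivers.
def result(students: int, fee: float, staff: int, salary: float) -> float:
    revenue = students * fee      # driver branch 1
    cost = staff * salary         # driver branch 2
    return revenue - cost

base = result(students=10_000, fee=2_000, staff=500, salary=30_000)

# Multi-year simulation: tweak one driver (3% student growth per year)
# and watch the effect propagate up the tree.
scenario = [result(round(10_000 * 1.03 ** y), 2_000, 500, 30_000)
            for y in range(4)]
```

In SAC the same thing is modeled visually, but the mechanics are these: each node is a formula over its children, and a simulation is just re-evaluating the tree per year.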

Data actions

Data actions are basic functionality in a planning & budgeting tool, and there were several exercises for them:

  • Copy: for example copy actuals to plan
  • Allocations: for example: allocate expenses between cost centers based on specific drivers
  • Advanced formulas: custom specific logic

The Copy and Allocations functions are quite easy to implement according to a basic template.
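The Copy and Allocation actions above can be sketched in plain Python, with dicts standing in for the planning model. The version names, cost centers, driver and figures are all invented:

```python
# Planning data keyed by (version, cost_center); invented figures.
data = {("ACTUAL", "CC1"): 800.0, ("ACTUAL", "CC2"): 200.0}

# Copy action: actuals become the starting point of the plan version.
for (version, cc), value in list(data.items()):
    if version == "ACTUAL":
        data[("PLAN", cc)] = value

# Allocation action: distribute a central overhead amount to cost
# centers, proportional to a driver (here: headcount, invented).
overhead = 300.0
headcount = {"CC1": 30, "CC2": 10}
total_hc = sum(headcount.values())
for cc, hc in headcount.items():
    data[("PLAN", cc)] += overhead * hc / total_hc
```

The template-based actions in SAC essentially parameterize exactly these two loops (source/target version, driver measure); advanced formulas are for the logic that doesn't fit such a template.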

The advanced formulas are implemented with scripting, so a lot is possible there. The exercises gave a good impression of the possibilities, and the scripting seemed well structured. Of course, scripting allows very complex selections, so I cannot yet judge how easy navigation and debugging are with complex formulas.

System features

In the course, several other features concerning the process and flow were shown as well, such as data locking, the planning calendar, collaboration, audits and security. It seems the tool offers everything that is needed.

Impression

All in all I am pretty impressed with the planning functionality and features that are available in SAC. I think for a central planning application it really makes sense to have it in the cloud. Typically the planning and budgeting phase is limited to a specific time period, and everybody starts working on it one day before the deadline, giving a huge peak load of concurrent users on the system. In the cloud, the system scales better on those days.

I have worked over the years with several SAP planning, budgeting and consolidation tools such as SAP BPC, SAP BW-IP and SEM BCS. But given all the features in place, the cloud advantage and SAP's roadmap, I think this is the first tool to consider for planning and budgeting in an SAP ecosystem.

In conclusion: SAP Analytics Cloud is a great tool for planning and is my first tool of choice for planning and budgeting in an SAP ecosystem.

Composites in SAP Lumira designer

The composite functionality in SAP Lumira designer is great. To illustrate this, I would like to demonstrate the capabilities with an example of a report I built using composites.

In this example the financial department wanted to see an actual/budget comparison for the different organizational units within the company. Based on sample data, the draft in Excel looked like this.

The organizational hierarchy is based on profit centers. Per entity the actual and budget figures are shown. A signal indicator shows whether the difference is within tolerance levels.
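The signal logic is simple enough to sketch: flag an entity whose actual/budget variance falls outside a tolerance band. The threshold below is invented, the report's actual tolerance levels were a business decision:

```python
TOLERANCE_PCT = 5.0  # invented tolerance band

def signal(actual: float, budget: float) -> str:
    """Return 'ok' when the variance is within tolerance, else 'alert'."""
    variance_pct = (actual - budget) / budget * 100
    return "ok" if abs(variance_pct) <= TOLERANCE_PCT else "alert"
```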

Within the reporting team the guideline is to work with the IBCS guide book, so these rules were applied. Below are some examples, taken from the website https://www.ibcs.com/.

Replace traffic lights: In this case it was more informative to work with a bar chart instead of traffic lights.

Embed chart elements in tables: The given draft included table information and graphical elements, so these can be nicely combined.

Unify outlier indicators: Since the graph was expected to contain both big and small numbers, it is necessary to handle the outliers.

Also an indicator was added to signal entities that need attention.

So after these discussions on the design, the new draft became:

In the previous version of Lumira Designer, SAP Design Studio, I would have asked a developer to build an SDK component to produce this figure. With the composites option and the new feature to iterate over a result set (see this blog), I can do this myself.

The data in this case is provided by a single BW BEx query containing the columns: actual data, budget data, differences in % and absolute, a graph color indicator and a blue attention indicator. The rows contain one characteristic, the profit center hierarchy.

The composite is built with very simple components such as text, panel and icons. For each row iteration over the data, a new row composite is created and filled with the relevant parameters for data and formatting.
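The row-by-row build can be sketched as follows: iterate over the query's result set and instantiate a "row composite" (here a plain dict) with its data and formatting parameters. The column names, figures and the 10% attention threshold are invented for the sketch; in Lumira Designer this happens in its scripting language against the BEx result set:

```python
# Invented result set rows, mimicking the BEx query output.
result_set = [
    {"profit_center": "PC-100", "actual": 950.0, "budget": 1000.0},
    {"profit_center": "PC-200", "actual": 1200.0, "budget": 1000.0},
]

rows = []
for rec in result_set:
    diff_pct = (rec["actual"] - rec["budget"]) / rec["budget"] * 100
    rows.append({
        "label": rec["profit_center"],
        "bar_value": diff_pct,                        # drives the bar chart
        "color": "good" if diff_pct >= 0 else "bad",  # graph color indicator
        "attention": abs(diff_pct) > 10,              # blue attention flag
    })
```

Each dict corresponds to one instantiated row composite; the composite's own components (text, panel, icons) are then bound to these parameters.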

The result looks like this:

It is clear that with the new functionalities BI developers have a lot more possibilities to create great visualizations without having to learn SDK skills. This is a lot easier for development and maintenance.

So in conclusion: composites are a great new functionality in SAP Lumira Designer.

SAP Lumira designer

SAP Lumira designer, the successor of SAP Design Studio, has been on the market and in use for more than two years now. I have been using it for a while as well, and this is the tool I have been waiting for ever since Design Studio came on the market. If you are looking to build a BI application, I don't think there is a better tool, especially on an SAP BW data warehouse: there is so much possible with just the standard functionality! In this blog I want to focus on the positives and will write about the three things I like best. As with everything, there are also some negatives, but I will save those for another time.

Number 1: Composites

With composites it is possible to make small building blocks, with a certain look, feel and functionality, and then use them as many times as you need to build up a full application. So in short, with composites everything is possible. I learned a lot from this blog, but there are many more with valuable insights. I have been using composites in the template report for a BI application, which makes the application easier to maintain. But they really come into the spotlight when you build custom-made visuals based on just data and programming logic. Also, when used for a more operational type of report with a lot of different panels and elements, it is amazing to see the simplicity in coding you can achieve for so many complex operations. I made one of those operational reports, which had a sibling in Design Studio, and the new way of building it is so much cleaner, more understandable and more flexible. I really love this functionality.

Number 2: Bookmarking

One of the best-liked features of the SAP BEx and SAP Design Studio tooling has always been the possibility of bookmarking. With bookmarking it is possible to capture the selections and navigation state in a report and save them for later use. For a monthly repeating report, for example, users can just open the bookmark with the settings and selections they liked and refresh the data. In a way this is a kind of self-service BI, hence its popularity. Bookmarks made by a user could be shared by emailing the bookmark link. In this new version, it is possible to build an application that shows all the bookmarks available for a user, for a specific report or for certain groups. It is also possible to make 'public' bookmarks, which are automatically available to all users of the report they relate to. This makes it a lot easier for users to create and share specific states of a report. For users, this functionality is the main reason they are willing to convert from Design Studio to Lumira Designer.
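The mechanics of bookmarking can be shown in miniature: capture a report's selection/navigation state, store it under a name, and control visibility per user. The state keys, user names and store design below are invented; Lumira Designer provides this through its bookmarking API, not like this:

```python
# Invented in-memory bookmark store.
bookmarks = {}

def save_bookmark(name, state, owner, public=False):
    """Capture a report state (selections, navigation) under a name."""
    bookmarks[name] = {"state": dict(state), "owner": owner, "public": public}

def visible_bookmarks(user):
    """Public bookmarks plus the user's own, as in the bookmark overview."""
    return [name for name, bm in bookmarks.items()
            if bm["public"] or bm["owner"] == user]

# A public monthly bookmark and a private draft (invented states).
save_bookmark("monthly-close", {"period": "2021.12", "org": "FIN"},
              owner="alice", public=True)
save_bookmark("my-draft", {"period": "2022.01"}, owner="bob")
```

The "refresh the data" part is the key design point: only the state is saved, so reopening a bookmark replays the selections against current data.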

Number 3: Action sheet

With the action sheet component it is possible to create a nice menu item list that is linked to other components. Its typical use is a menu area with action sheets linked to the icons. I think this is a nice component that offers a user-friendly way of showing several menu items.

Honorable mention: Comment function

With the comment functionality it is possible to make context-related notes, either for public or private use. For example, a user can make a private reminder attached to a number that raises questions, or a public note about the same number with an explanation of why it raises questions. This is a promising functionality, and although I have not implemented it fully yet, users respond well to the demonstrations.

To summarize: Lumira designer is the best tool for building a scripted BI application on an SAP BW data warehouse.