BI Summit 2019 Dutch Universities

This year the BI Summit of the Dutch universities was held at the TU/e (13th September 2019). This was already the third edition, and it seems that every year it is growing in size and becoming more professional. There were a lot of interesting presentations and in this blog I would like to highlight three of them:

  • ‘Reinout van Brakel – Sharing insights across Universities’; for showing the ideas and principles behind the sector dashboard
  • ‘Comprehensive forecasting model – BI-Cluster of the TU/e’; for showing the forecasting model they built, with different scenario options
  • ‘Piet Daas – Big Data meets official statistics’; for the inspiring demonstrations of using ‘big’ data to gain insights

I start with the presentation ‘Reinout van Brakel – Sharing insights across Universities’ of the VSNU (the public affairs agency for the Dutch universities). He explained the role of the VSNU in Dutch policy making, and how over the years they have facilitated gathering facts and figures from the Universities, in order to be transparent about the achievements of the Universities. For individual Universities this is also highly valuable information, since they can see (on an aggregated level) information of other higher education organizations. This allows them to benchmark (e.g. on personnel), view market share (e.g. of a certain teaching program) and see connections (e.g. where first-year master’s students come from). In 2018 it was decided to build a dashboard on this information (in Tableau), so now the data is easily accessible. Reminder to myself: check their dashboards regularly, they might be inspiring and the underlying data should be comparable to ours.

The presentation of the TU/e, ‘Comprehensive forecasting model – BI-Cluster of the TU/e’, was about a lot more than just the forecasting model. They also explained how their BI Cluster is delivering value to the organization. It is quite a remarkable achievement that within a few years they have a professional BI operation running that is delivering BI products to the organization. The second part was mainly about the forecasting model, which they built in Power BI. For me it was inspiring to see the principles behind it. Reminder to myself: check if this can add value at my current BI project as well.

The presentation ‘Piet Daas – Big Data meets official statistics’, of the Dutch central bureau of statistics (CBS), was very inspiring in the way it showed how to use big data. The CBS has access (with limitations of course) to a lot of data, like road sensors, phone location data and ship locations, to name a few. With this data they can gain very interesting insights to improve decision making in the Netherlands. Reminder to myself: keep an eye on interesting reports from the CBS.

To conclude: A very interesting BI Summit this year. Check the VSNU dashboard regularly. Check if there is a requirement for a forecasting model like the TU/e has. Keep an eye out for interesting CBS reports.

Modern SAP BI solutions – is it time to upgrade?

Last week I attended a SAP BI meeting hosted by the University of Utrecht, concerning the direction SAP is taking with its BI solutions BW4HANA and SAC (SAP Analytics Cloud).

BW4HANA

The BW4HANA suite has a lot to offer and looks really promising. However, since the current BWonHANA systems are also pretty complete datawarehousing suites, it is a difficult business case to put a lot of effort into what is almost a new implementation. That is probably why SAP is stopping support for the older versions: staying up to date is by itself important for the continuity of systems within companies. The current situation is: support for version 7.4 until the end of 2020, support for version 7.5 until the end of 2024.

My impression at the moment concerning an upgrade is: look for the right moment. I am not too fond of the Eclipse front-end, and business logic built and maintained in this environment is difficult to debug. There is still time and opportunity for SAP to improve.

SAP Analytics Cloud

SAP Analytics Cloud is the new BI solution for data discovery. It is the successor of SAP Lumira 1.31, which I compared in this blog with Tableau, and also of SAP Lumira 2.x Discovery, which was launched quite recently. I am very curious to see how it works and what functionalities are available. Apart from data discovery, there will be functionality for planning, predictive analytics and application design. With the last topic it enters the realm of the on-premise tool SAP Lumira Designer (the successor of SAP Design Studio). SAP has put an end date on the on-premise environment as well, end of 2028, and is focusing new developments on the cloud tools.

Personally, I am skeptical towards cloud tools, since it is not possible to control the updates. Especially if applications with a lot of users are built in them, it can be really annoying if a certain functionality no longer works after an update. And although SAP says it will try to avoid this, I am 100% certain it will happen somewhere in the future, and I think users will not accept it.

To conclude: BW4HANA is great, but the moment to upgrade needs to be chosen carefully. SAC is the SAP BI option for data discovery; use that instead of SAP Lumira Discovery if possible.

Tangible benefits of a good datawarehouse – part 2

The benefits of a datawarehouse are not easily quantifiable. Therefore I like to point them out when there is an opportunity, see also Tangible benefits of a good datawarehouse.

Last week the UvA received a positive quality assessment from the NVAO (the accreditation organization tasked with providing an expert and objective assessment of the quality of higher education in the Netherlands and Flanders).

In their review, the committee expressed its admiration for the way in which the UvA monitors its quality assurance using UvAData (the datawarehouse) and UvAQ (whose reports are also built and distributed using the datawarehouse). For the full press release, click here.

With this positive quality assessment, the UvA saves a lot of time and effort that otherwise would need to be spent on thoroughly assessing every program individually.

In conclusion: A good datawarehouse saves time and effort and has good business value.

Robots

One of the topics that is getting more attention lately is ‘Robotic Process Automation’ (RPA). At Tacstone we built a new proposition on this service. In essence this is a software service that allows you to automate processes without having to modify the existing software systems. The software can log on, simulate keyboard and mouse input, and read information from the screen. This service is most useful for processes that are highly repetitive and have a relatively high volume. In this blog I would like to explain a use case we implemented a few years ago, based on a similar principle.

In our case, we wanted to make beautiful reports which we could then distribute as PDF. The tool we wanted to work with, SAP Design Studio (SAP Lumira Designer these days), allows for making these beautiful reports. Unfortunately, the standard function to print to PDF is very limited, and at the time, in 2014 with Design Studio 1.2, it was not possible to automate it either.

To solve the problem of properly printing, we developed an SDK component that could handle this in the desired way. With this SDK component, it was possible to push a print button, and the default ‘print to PDF’ from your browser shows up, with the layout just right. All you need to do then is ‘save as PDF’.
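To give an idea of the mechanism (a minimal sketch of the client-side idea, not our actual SDK component, which is more elaborate), a hypothetical print button can apply a print-friendly stylesheet and then open the browser’s print dialog, from which the user saves the result as PDF. All names and files here are illustrative assumptions.

```javascript
// Minimal sketch of the client-side idea behind the print component.
// Assumption: the report is rendered as HTML in the browser and a
// print-specific stylesheet ("print.css") defines the desired layout.
function printReportAsPdf() {
  // Activate a stylesheet that only applies to printed output,
  // so page size, margins and hidden toolbars come out right.
  var link = document.createElement("link");
  link.rel = "stylesheet";
  link.media = "print";
  link.href = "print.css"; // hypothetical print layout definition
  document.head.appendChild(link);

  // Open the browser's standard print dialog; the user picks
  // "Save as PDF" (or a PDF printer) as the destination.
  window.print();
}

// Hypothetical wiring: a print button somewhere in the report.
document.getElementById("printButton")
        .addEventListener("click", printReportAsPdf);
```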

This was already a huge improvement. At the time we figured that, since scheduling is basic functionality that SAP had built in the past for their BEx tools, SAP would build this in the new tools as well. Considering this, we might as well have an intern or someone manually running these reports and saving them as PDF. On average the intern could do maybe 20 in an hour, so that might be OK. So there we had a fallback scenario to continue on this route.

But of course this sounds like a horrible job, so we tried to figure out a way to automate it. To make a long story short: with the use of PhantomJS, Java and ABAP, we managed to build a robot that was able to mimic this human interaction, without having to wait for standard SAP functionality.
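As an illustration of the PhantomJS part of such a robot (a simplified sketch, not our production code; the URL, file names and timings are made up), the headless browser opens the report, waits for it to render, and writes the result to a PDF file:

```javascript
// phantom-report.js -- simplified sketch of the PDF robot
// Run with: phantomjs phantom-report.js
// Assumptions: the report is reachable at a plain URL and is fully
// rendered a few seconds after loading; names and timings are made up.
var page = require('webpage').create();

// Define the paper layout of the resulting PDF.
page.paperSize = { format: 'A4', orientation: 'portrait', margin: '1cm' };

page.open('http://example.local/report?period=2016-01', function (status) {
  if (status !== 'success') {
    console.log('Could not load the report');
    phantom.exit(1);
  }

  // Give the report some time to finish rendering its charts and tables.
  window.setTimeout(function () {
    // Write the rendered page to a PDF file on disk.
    page.render('report_2016-01.pdf');
    phantom.exit(0);
  }, 5000);
});
```

In a setup like this, the Java and ABAP parts would typically take care of scheduling the runs and looping over the report variants; the PhantomJS script is only the ‘browser hands’ of the robot.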

At the moment the robot has been in production for more than three years, has produced more than 80.000 PDF reports, and is still going strong. Had we had an intern doing this manually, it would have taken this person more than 4000 working hours! Making this robot was therefore a sound investment. Also, the standard scheduling functionality SAP is offering is not as sophisticated as we need. So although the robot started out as a temporary solution, it is becoming pretty permanent.
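The back-of-the-envelope calculation behind that number, using only the figures mentioned above:

```javascript
// Back-of-the-envelope check of the numbers mentioned above.
var reportsProduced = 80000; // PDFs produced by the robot so far
var reportsPerHour  = 20;    // what an intern could do manually, on average

var internHours = reportsProduced / reportsPerHour;
console.log(internHours + ' working hours saved'); // 4000 working hours saved
```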

In conclusion: A robot to replace your human interface is worthwhile considering!

Scrum please!

Last week I had the pleasure of attending a Scrum Master training, given by Zilverline. I have to admit, I was skeptical at first about Scrum. I am a trained Prince2 Practitioner, so I figured, this is probably “just old wine in new bags”, as we say in Holland.

And in a way that is the case of course. But still, the principles of Scrum are very attractive. Especially since I realized I was already working in a very agile way. This has mainly to do with the environment and product I work with. In this environment I am a trusted resource who has quite a lot of freedom to put product increments into production. The product owner is “a man with a vision” who can look at the bigger picture. And the product is BI, where quick prototyping is needed to demonstrate the potential of a new report, and where the time from idea to product can be short.

What I liked about Scrum is the limited overhead: 4 meetings, 3 roles, 2 lists. Compared to Prince2 that is a lot less overhead to maintain. The roles are very limited and clear. The role of the Product Owner makes a lot of sense, especially since this role is carried by one person.

Planning Poker was something I liked very much as well. I think it can be a good teambuilding exercise, and it also builds on the principle that estimating is very difficult and should not become a goal in itself.

It reminds me of a quote from Terry Pratchett, Going Postal:
“Mr. Pony struggled manfully with the engineer’s permanent dread of having to commit himself to anything, and managed, “Well, if we don’t lose too many staff, and the winter isn’t too bad, but of course there’s always—”

So in that sense, if you do not have to commit completely, but just go for an order of magnitude, you quickly get a good enough estimate of the workload. And you can start the work!
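As a toy illustration of that ‘order of magnitude instead of commitment’ idea (my own sketch, not part of the training material): snapping a raw gut-feeling estimate onto the nearest planning poker card keeps the discussion about relative size rather than false precision.

```javascript
// Toy sketch: snap a raw estimate (in story points or hours) onto the
// nearest planning poker card, so nobody has to commit to "exactly 7".
var cards = [1, 2, 3, 5, 8, 13, 20, 40, 100]; // a common planning poker deck

function nearestCard(rawEstimate) {
  // Pick the card whose value is closest to the gut-feeling estimate.
  return cards.reduce(function (best, card) {
    return Math.abs(card - rawEstimate) < Math.abs(best - rawEstimate) ? card : best;
  });
}

console.log(nearestCard(7));  // 8  -- close enough, start the work
console.log(nearestCard(27)); // 20 -- an order of magnitude, not a promise
```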

In my current project, we could do with a bit more structure and additional teambuilding is necessary to tear down the artificial organizational boundaries. I am convinced Scrum will be helpful in achieving that goal.

I passed the test, so I can now call myself Scrum Master. I do feel like a sage!

To conclude: Scrum please!