
Modern SAP BI solutions – is it time to upgrade?

Last week I attended an SAP BI meeting hosted by the University of Utrecht about the direction SAP is taking with its BI solutions BW4HANA and SAC (SAP Analytics Cloud).

BW4HANA

The BW4HANA suite has a lot to offer and looks really promising. However, since the current BWonHANA systems are already fairly complete datawarehousing suites, the business case for investing heavily in what is almost a new implementation is a difficult one. That is probably why SAP is ending support for the older versions: staying up to date is in itself important for the continuity of systems within companies. The current situation: support for version 7.4 runs until the end of 2020, and support for version 7.5 until the end of 2024.

My impression at the moment: look for the right moment to upgrade. I am not too fond of the Eclipse front-end, and business logic built and maintained in this environment is difficult to debug. There is still time and opportunity for SAP to improve.

SAP Analytics Cloud

SAP Analytics Cloud is the new BI solution for data discovery. It is the successor of SAP Lumira 1.31, which I compared with Tableau in this blog, and also of SAP Lumira 2.x Discovery, which was launched quite recently. I am very curious to see how it works and what functionality is available. Apart from data discovery, there will be functionality for planning, predictive analytics and application design. With that last topic, it enters the realm of the on-premise tool SAP Lumira Designer (the successor of SAP Design Studio). SAP has put an end date on the on-premise environment as well, the end of 2028, and is focusing new development on the cloud tools.

Personally, I am skeptical towards cloud tools, since it is not possible to control the updates. Especially if applications with a lot of users are built in them, it can be really annoying when a certain functionality no longer works after an upgrade. And although SAP says it will try to avoid this, I am 100% certain it will happen at some point, and I think users will not accept it.

To conclude: BW4HANA is great, but the moment to upgrade needs to be chosen carefully. SAC is the SAP BI option for data discovery; use it instead of SAP Lumira Discovery if possible.

Tangible benefits of a good datawarehouse – part 2

The benefits of a datawarehouse are not easily quantifiable. Therefore I like to point them out when there is an opportunity; see also Tangible benefits of a good datawarehouse.

Last week the UvA received a positive quality assessment from the NVAO (the accreditation organization tasked with providing an expert and objective assessment of the quality of higher education in the Netherlands and Flanders).

In their review, the committee expressed its admiration for the way in which the UvA monitors its quality assurance using UvAData (the datawarehouse) and UvAQ (whose reports are also built and distributed using the datawarehouse). For the full press release, click here.

With this positive quality assessment, the UvA saves a lot of time and effort that would otherwise need to be spent on thoroughly assessing every program individually.

In conclusion: A good datawarehouse saves time and effort and has good business value.

Robots

One of the topics getting more attention lately is 'Robotic Process Automation' (RPA). At Tacstone we are building a new proposition around this service. In essence, it is a software service that makes it possible to automate processes without having to modify the existing software systems. The software can log on, simulate keyboard and mouse input, and read information from the screen. This service is most useful for highly repetitive processes with a relatively high volume. In this blog I would like to describe a use case we implemented a few years ago, based on a similar principle.

In our case, we wanted to create beautiful reports which we could then distribute as PDF. The tool we wanted to work with, SAP Design Studio (SAP Lumira Designer these days), makes it possible to build such reports. Unfortunately, the standard function to print to PDF is very limited, and at the time (2014, Design Studio 1.2) it was not possible to automate it either.

To solve the printing problem, we developed an SDK component that could handle it in the desired way. With this SDK component, you could push a print button and the default 'print to PDF' dialog from your browser would show up, with the layout just right. All you then needed to do was 'save as PDF'.

This was already huge progress. At the time, we thought that since scheduling is basic functionality SAP had built for its BEx tools in the past, SAP would build it into the new tools as well. In the meantime, we could have an intern manually run these reports and save them as PDF. On average an intern could do maybe 20 an hour, so that seemed acceptable. That gave us a fallback scenario to continue on this route.

But of course this sounds like a horrible job, so we tried to figure out a way to automate it. Long story short: with PhantomJS, Java and ABAP, we managed to build a robot that could mimic this human interface, without having to wait for standard SAP functionality.
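The robot's PhantomJS/Java/ABAP internals are not shown here, but to illustrate the principle, here is a minimal, hypothetical sketch in Python of the same idea with today's tooling: a headless browser opens a report URL and saves the rendered page as a PDF. It assumes Selenium 4 with a Chrome driver available; the report URL is made up.

import base64
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

def report_to_pdf(url: str, out_path: str) -> None:
    """Open a report URL in a headless browser and save it as a PDF."""
    options = Options()
    options.add_argument("--headless=new")
    driver = webdriver.Chrome(options=options)
    try:
        driver.get(url)  # for a real BI app, add explicit waits for rendering
        # Chrome DevTools Protocol call; returns the PDF as base64 data
        result = driver.execute_cdp_cmd("Page.printToPDF", {"printBackground": True})
        with open(out_path, "wb") as f:
            f.write(base64.b64decode(result["data"]))
    finally:
        driver.quit()

report_to_pdf("https://example.com/report?id=42", "report_42.pdf")

Loop this over a list of report URLs on a schedule and you have the core of such a robot.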

At the moment, the robot has been in production for more than three years, has produced more than 80,000 PDF reports, and is still going strong. Had an intern been doing this manually at 20 reports an hour, it would have taken more than 4,000 working hours! Building the robot was therefore a sound investment. Also, the scheduling functionality SAP offers as standard is not as sophisticated as we need. So although the robot started out as a temporary solution, it is becoming pretty permanent.

In conclusion: A robot to replace your human interface is worthwhile considering!

Scrum please!

Last week I had the pleasure of attending a Scrum Master training given by Zilverline. I have to admit, I was skeptical at first about Scrum. I am a trained PRINCE2 Practitioner, so I figured this was probably "just old wine in new bags", as we say in Holland.

And in a way that is the case, of course. But still, the principles of Scrum are very attractive, especially since I realized I was already working in a very agile way. This has mainly to do with the environment and the product I work with. In this environment I am a trusted resource with quite a lot of freedom to put product increments into production. The product owner is "a man with a vision" who can look at the bigger picture. And the product is BI, where quick prototyping is needed to demonstrate the potential of a new report, and where the implementation time from idea to product can be short.

What I liked about Scrum is the limited overhead: 4 meetings, 3 roles, 2 lists. Compared to PRINCE2, that is a lot less overhead to maintain. The roles are very limited and clear. The role of the Product Owner makes a lot of sense, especially since it is carried by one person.

Planning Poker was something I liked very much as well. I think it can be a good teambuilding exercise, and it also embraces the principle that estimating is very difficult and should not become a goal in itself.

It reminds me of a quote from Terry Pratchett's Going Postal:
"Mr. Pony struggled manfully with the engineer's permanent dread of having to commit himself to anything, and managed, 'Well, if we don't lose too many staff, and the winter isn't too bad, but of course there's always—'"

So in that sense, if you do not have to commit completely but just go for an order of magnitude, you quickly get a good enough estimate of the workload. And you can start the work!

In my current project we could do with a bit more structure, and additional teambuilding is necessary to tear down the artificial organizational boundaries. I am convinced Scrum will be helpful in achieving that goal.

I passed the test, so I can now call myself a Scrum Master. I do feel like a sage!

To conclude: Scrum please!

Enterprise deep learning with TensorFlow

An interesting course offered by openSAP is Enterprise Deep Learning with TensorFlow, which is currently in its final week. I found it a great insight into the current state of machine learning possibilities.

It was a very hands-on training where it was possible to play with TensorFlow applications. TensorFlow is an open-source library for numerical computation. For SAP, TensorFlow is a key element of the SAP Leonardo Machine Learning architecture. With SAP Leonardo, SAP aims to make machine learning easy to use for businesses.
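To give an idea of what "a library for numerical computation" means in practice, here is a tiny sketch of my own (using today's TensorFlow 2 API, not code from the course): TensorFlow evaluates operations on tensors and can differentiate them automatically.

import tensorflow as tf

x = tf.Variable([[1.0, 2.0], [3.0, 4.0]])
with tf.GradientTape() as tape:
    y = tf.reduce_sum(tf.matmul(x, x))  # a simple numerical computation
grad = tape.gradient(y, x)              # TensorFlow computes the gradient for us
print(y.numpy(), grad.numpy())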

Deep learning is a sub-field of neural networks, machine learning, and artificial intelligence. It is inspired by the architecture of the human brain and consists of neural networks with many layers.

Deep learning is a promising approach when:

  • there is a large amount of training data available
  • it concerns solving an image/audio/natural language problem
  • the raw input data has little structure and the model needs to learn meaningful representations from it (e.g., pixels in an image)

One of the topics in the course was convolutional networks, which are used to classify objects in images. The complexity of doing this is enormous, but by combining several techniques and applying smart optimizations it becomes feasible.
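As a minimal sketch of what such a network looks like (this is my own toy example in Keras, not code from the course; the MNIST digits are just a convenient stand-in dataset):

import tensorflow as tf

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0  # add a channel dimension, scale to [0, 1]

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
    tf.keras.layers.MaxPooling2D(),  # downsample the feature maps
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),  # one class per digit
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=128)

The convolution and pooling layers are exactly one of those smart optimizations: instead of connecting every pixel to every neuron, small filters are slid across the image, which keeps the number of parameters manageable.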

Some examples of use cases were also given. One of them was medical image segmentation with fully convolutional networks: images retrieved from an MRI scanner are processed with a fully convolutional network to construct a new image that points out possible cancer cells.

I found the explanation of how to deal with unsupervised and reinforcement learning very informative as well. To explain: machine learning applications fall into three broad contexts:

  • Supervised learning: there is a dataset with labels or annotations. Usually this dataset is not too big, because labelling all the data is costly. Most machine learning is done this way.
  • Unsupervised learning: there is data without labels or annotations. Typically this data is generated by machines or software, in an internet-of-things kind of way. Machine learning offers techniques to identify anomalies and outliers in this data, making good use of it. An example is a financial pattern that is being monitored: when an anomaly occurs, it may be due to fraud.
  • Reinforcement learning: there is no initial dataset; the dataset is accumulated through experience. The machine learning agent interacts with the environment in a trial-and-error kind of way. An example is a robot learning a task: it performs actions, and when an action is correct the agent is rewarded; when it is incorrect, there is no reward, or even a penalty. A minimal sketch of this idea follows right after this list.
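As promised, a minimal reinforcement learning sketch (a made-up example of mine, not from the course): an epsilon-greedy agent that learns purely from trial-and-error rewards which of three actions pays off best.

import random

true_reward_prob = [0.2, 0.5, 0.8]   # hidden from the agent
estimates = [0.0, 0.0, 0.0]          # the agent's running reward estimates
counts = [0, 0, 0]
epsilon = 0.1                        # fraction of steps spent exploring

for step in range(10000):
    if random.random() < epsilon:
        action = random.randrange(3)              # explore: try a random action
    else:
        action = estimates.index(max(estimates))  # exploit: pick the best so far
    reward = 1.0 if random.random() < true_reward_prob[action] else 0.0
    counts[action] += 1
    # incremental average: nudge the estimate toward the observed reward
    estimates[action] += (reward - estimates[action]) / counts[action]

print(estimates)  # ends up close to [0.2, 0.5, 0.8]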

Another inspiring example was generating new images using GANs (Generative Adversarial Networks). Here a generator generates images, combined with a discriminator that determines whether a picture is a real image or a fake (e.g., blurry) one. This approach gives impressive results.
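A minimal sketch of that generator/discriminator setup (again my own illustration in Keras, with made-up sizes; a real GAN needs many training steps and a proper image dataset):

import tensorflow as tf

NOISE_DIM = 64

generator = tf.keras.Sequential([
    tf.keras.layers.Dense(128, activation="relu", input_shape=(NOISE_DIM,)),
    tf.keras.layers.Dense(28 * 28, activation="sigmoid"),
    tf.keras.layers.Reshape((28, 28, 1)),  # noise in, image out
])
discriminator = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28, 1)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(1),  # logit: real (high) vs. fake (low)
])

bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
g_opt = tf.keras.optimizers.Adam(1e-4)
d_opt = tf.keras.optimizers.Adam(1e-4)

def train_step(real_images):
    noise = tf.random.normal([tf.shape(real_images)[0], NOISE_DIM])
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fake_images = generator(noise, training=True)
        real_logits = discriminator(real_images, training=True)
        fake_logits = discriminator(fake_images, training=True)
        # discriminator: label real images 1, generated images 0
        d_loss = (bce(tf.ones_like(real_logits), real_logits)
                  + bce(tf.zeros_like(fake_logits), fake_logits))
        # generator: fool the discriminator into labelling fakes as 1
        g_loss = bce(tf.ones_like(fake_logits), fake_logits)
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, discriminator.trainable_variables),
                              discriminator.trainable_variables))
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, generator.trainable_variables),
                              generator.trainable_variables))

The two networks play against each other: as the discriminator gets better at spotting fakes, the generator is forced to produce more convincing images.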

To conclude: another very inspiring course from the openSAP learning environment. Very useful machine learning techniques for businesses were presented.