How SAP’s Data Services Can Enable Your Learning Integrations
In this blog, learn how SAP’s Learning Data Services can help you regain bulk access to your data.
The move towards SAP’s SaaS Learning module has been beneficial for nearly everyone, but there have been some downsides. One such downside is the loss of previously flexible bulk export functionality. In this article, we explore the issue and the out-of-box workarounds, and show how SAP’s Learning Data Services can help you regain bulk access to your data.
Where’s My Learning Data?
By design (and to support the stability of the platform) the SAP suite doesn’t make bulk exports of large amounts of data easy. In some cases, bulk export is prohibited entirely. There are very common and practical reasons for this in a shared computing environment like the SAP cloud, but knowing those reasons doesn’t help to solve the challenges.
While out-of-box solutions to this issue exist and provide options, they all have their drawbacks:
APIs currently offer features allowing users to retrieve data in a manner designed for integration. Extraction, however, happens one or two records at a time, in response to specific API “questions”; bulk data exports are not available. This method is designed to answer questions like “show me this student’s learning plan,” not questions like “show me all the students that have this item on their learning plan.”
Some tools that are presently available and are candidates for a bulk export, like Integration Center, do not currently support Learning data. This feature is on the roadmap, but a solid date hasn’t yet been given.
Some of the previous “cheat” methods, like using a custom report, are fraught with risk now that SAP has sunset the PRD tool for new reports. This was always a risky proposition, as the report engine could be clunky, slow, and fragile.
These limitations make it extraordinarily difficult to feed hungry business intelligence tools or to drive integrations from Learning events. Indeed, getting any large portion of meaningful data is an uphill struggle if you need that data in a way the SAP Learning API doesn’t presently support.
Enter Learning Data Services
Thankfully, SAP has a separate offering that can satisfy the need for large-scale data output from the Learning module. It’s called data services. Rather than trying to feed the data through a small pipe (like the APIs), data services are flat-file exports of the table deltas from the Learning module data model. The data is delivered via Secure File Transfer (SFTP). This allows far more flexibility with the delivered size of the data and creates significantly less strain on the Learning module.
While you get access to a significantly larger volume of data here, the trade-off is that the client end is responsible for more of the effort. There needs to be a place for the data to land, and the client must build and maintain the process to consume the data in a meaningful manner. There’s no “pre-digestion” of the data performed in the cloud. You get it as raw flat export files that you must manage.
Nonetheless, with this great responsibility comes ultimate flexibility. Even though you have to do everything to manage the data coming in, you can do whatever you need to do in order to make it fit your consumption needs.
Important things you should know about Learning data services:
The tables are bundled in three table groupings (SAP calls them “packages”). Package A is the most commonly used set of tables. B offers the somewhat lesser used tables and C the least commonly used tables. Pricing is based on the selected packages of tables.
This is an out-of-box offering from SAP. There is no real customization possible to the offering, except the ability to select the table packages.
What Do I Get?
Once you’ve signed up, you’ll receive these files via SFTP delivery:
A full extract of all requested tables in your selected packages. This will allow you to “prime the pump” on your receiving side to have a set of base data to work with.
A set of delta files (typically delivered nightly) containing the rows that changed in each table in your selected packages over the course of the delta period.
A set of key files (also typically delivered nightly) that are just the columns comprising the primary key of the tables in your packages. Because delta files do not include action flags, these key files will allow you to detect deletions from the table.
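Because the delta files carry no delete flags, detecting deletions boils down to comparing the delivered key file against the keys you already hold locally. A minimal sketch in Python, assuming a simple CSV key file with a header row (the `STUD_ID` column name is an illustrative assumption, not SAP’s actual schema):

```python
import csv
import io

def detect_deletions(local_keys, key_file_text, key_column="STUD_ID"):
    """Return keys held locally that are absent from the delivered key file.

    key_file_text: contents of a nightly key file (CSV with a header row).
    Any local key missing from the file was deleted upstream.
    Column name is illustrative, not SAP's actual schema.
    """
    delivered = {row[key_column] for row in csv.DictReader(io.StringIO(key_file_text))}
    return sorted(set(local_keys) - delivered)

# Key "1003" was deleted upstream, so it no longer appears in the key file.
key_file = "STUD_ID\n1001\n1002\n"
deleted = detect_deletions(["1001", "1002", "1003"], key_file)  # ['1003']
```

Your consumption process would then remove (or soft-delete) those rows from the local copy before applying the night’s delta.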
How Does It Work?
SAP will conduct a mini-project with you to ensure that the data is delivered properly and that your consumption process is working. While they do not help with the development of the receipt side of the process, they are available to answer questions about the delivery aspects.
Once you’ve signed off on their delivery you are ready to go into production:
You’ll request a full extract of the tables you’ve paid for by raising a ticket with SAP support.
Your SAP project team will enable nightly delta and key files for your feed from your production instance.
Your data receipt process picks up the files from the SFTP delivery site and consumes them.
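In practice, the consumption step is mostly an upsert: each delta row inserts or replaces the local row with the same primary key. A hedged sketch of that core loop, assuming a CSV delta file keyed on an illustrative `ITEM_ID` column (not the actual table layout):

```python
import csv
import io

def apply_delta(table, delta_text, key_column="ITEM_ID"):
    """Upsert the rows from a nightly delta file into a local table.

    `table` maps primary key -> row dict. Delta files carry changed rows
    only; they insert or replace, never delete (deletions are detected
    separately from the key files). Column names are illustrative.
    """
    for row in csv.DictReader(io.StringIO(delta_text)):
        table[row[key_column]] = row
    return table

# Night 1 delta adds two items; night 2 revises one of them.
table = {}
apply_delta(table, "ITEM_ID,TITLE\n10,Safety 101\n11,Ethics\n")
apply_delta(table, "ITEM_ID,TITLE\n11,Ethics (rev 2)\n")
```

A real receipt process would wrap this in file pickup from the SFTP site, transaction handling, and logging, but the upsert-by-primary-key pattern is the heart of it.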
Are There Any Risks to Be Concerned About?
As always, there are risks to consider for any integration:
Deltas can get out of sync with your main data destination. This can be remedied in different ways depending on your consumption process and its implementation, though it would be wise to retain the delta files for a week or two so you can catch up if necessary. The emergency fix is to enter another support ticket for a full extract that lets you reset your baseline.
Remember, your data will be delivered exactly as it is in the database. If your data contains special characters or carriage returns, they will be present in the extracts. Be sure your consumption process can handle them (and has robust error handling in general).
In that same vein, full extracts are only delivered on demand via ticket request. They’re typically an exception (or at most a very occasional occurrence) and usually need to be managed differently than your standard delta files.
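Embedded commas and carriage returns are the classic way a naive line-by-line parser breaks. Python’s `csv` module, for instance, copes with them as long as the export quotes its fields; a quick sketch (the column names and quoting convention are assumptions about the export format):

```python
import csv
import io

# A raw extract whose DESCRIPTION field contains a comma and an embedded
# carriage-return/line-feed, exactly as stored in the database.
raw = 'ITEM_ID,DESCRIPTION\n42,"Line one\r\nLine two, with a comma"\n'

# Quoting lets the parser keep the embedded newline inside a single row
# rather than splitting it into two broken records.
rows = list(csv.reader(io.StringIO(raw, newline="")))
header, record = rows
```

If the feed turns out not to quote such fields, you would need a pre-cleaning step instead; testing against real extract samples early in the mini-project is the safest way to find out.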
Sounds Good—What’s Next?
Wrangling the large amount of data your Learning module produces can be difficult with the normal out-of-box tools. Using the Learning data services offering can be a reasonable answer to data-starved integrations, reporting tools, and other interfaces.
Working with a preferred and experienced vendor will help make your transition to Learning data services much easier and will help you avoid some of the pitfalls.
About the author
Christopher Olive is a chief architect at Effective People. He is an experienced IT architect with a demonstrated history of working in the human capital management software industry.
Christopher is skilled in strategy, integrations, and migrations and experienced in system administration, databases, and SaaS implementations.