
The Prism Analytics Journey: Transforming Data

Updated: Apr 4

It’s been an interesting journey of Prism Analytics blogs so far, and now it’s time for Part 3. In the previous two blogs, we covered what Prism Analytics is capable of, reasons to use it, and how to know that your data is secure. We also walked through making Prism Analytics available, storing your data in tables (where the transformations happen), and publishing a derived dataset.


In this blog, we present some of the functions and transformations you can apply to your data. We will explore Unions, Filters, Group By, and a few calculations.


Once your tables are created, it’s time to bring them together and work with the data.


This example combines Workday data with non-Workday data coming from a third-party LMS. We are using 70 rows with Employee ID, Name, Grade, and Test Date. The Workday report uses “Active Employee – Indexed” to bring in basic employee data. For the report to be available to Prism Analytics, don’t forget to select “Enable for Prism” in the report’s advanced settings.



* Creating a table from a CSV file from a different system
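
If you want to sanity-check that CSV before creating the table, here is a minimal pandas sketch. Only the column names come from the example above; the file contents and sample rows are made-up placeholders, not real data.

```python
import io
import pandas as pd

# Hypothetical slice of the LMS export. The column names mirror the example
# above; the rows are placeholder values for illustration only.
csv_text = """Employee ID,Name,Grade,Test Date
21001,Jane Example,7,2023-01-15
21002,John Placeholder,5,2023-02-20
"""

lms = pd.read_csv(io.StringIO(csv_text), parse_dates=["Test Date"])
print(lms.dtypes)   # confirm Grade parsed as numeric and Test Date as a date
print(len(lms))     # in the real example this would be about 70 rows
```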


Now it’s time for action! In the Data Catalog, click “Create” and then “Derived Dataset”.




Workday will prompt you to select which table will be your primary pipeline. For this demonstration, our primary pipeline will be “Active Employees”. As a best practice, we name Derived Datasets with the prefix “DDS”.



Once the Derived Dataset is created, we can go to quick actions and select “Edit Transformations”. That’s where we can start adding additional pipelines and working with the data.



Next, we will cover these functions: Union, Filter, and Group By. More functions, such as Join, Manage Fields, and Validations, are coming in future blogs.


1.      Union


When we use the “Union” function, we want to combine data with the primary pipeline. In this stage, we need to define which columns from the additional pipeline match columns in the primary pipeline, so the data is added appropriately.


In the union process, you can add as many fields as you want from each pipeline, matched or not. Fields that are not matched will appear in their own separate columns.



Once you click “Done”, you will have the Union output. In this demonstration, the columns “Grade” and “Test Date” were added to the dataset, and because we matched on “Employee ID”, only the Employee IDs that matched will have values for those two fields.
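
For intuition, here is a minimal pandas sketch of what a union of two pipelines looks like conceptually. This is not Prism’s engine, and the dataframes and values below are mock placeholders: matched columns (“Employee ID”) line up, while unmatched columns stay separate and are simply blank for rows coming from the pipeline that doesn’t carry them.

```python
import pandas as pd

# Primary pipeline: basic employee data from the Workday report (mock rows).
active_employees = pd.DataFrame({
    "Employee ID": [21006, 21007],
    "Full Name": ["Sarah Davis", "Alex Placeholder"],
})

# Additional pipeline: test results from the third-party LMS (mock rows).
lms_results = pd.DataFrame({
    "Employee ID": [21006, 21006],
    "Grade": [8, 2],
    "Test Date": ["2023-01-15", "2023-06-30"],
})

# A union stacks the rows of both pipelines. "Employee ID" is the matched
# column here; unmatched columns (Full Name, Grade, Test Date) remain
# separate and are blank (NaN) for rows from the pipeline that lacks them.
union_output = pd.concat([active_employees, lms_results], ignore_index=True)
print(union_output)
```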

 

 2.      Filter


Filters can be added in a standard fashion, or converted to advanced mode, where you can apply nested and complex filters to your dataset.



When you choose to apply advanced filters, a command box prompts you to enter a computed filter expression.
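
Prism’s advanced filter uses its own expression syntax in that command box; as a rough analogy only, here is what a nested, “advanced” style condition looks like in pandas, using mock data.

```python
import pandas as pd

# Mock test results, for illustration only.
results = pd.DataFrame({
    "Employee ID": [21006, 21006, 21007],
    "Grade": [8, 2, 6],
    "Test Date": pd.to_datetime(["2023-01-15", "2023-06-30", "2022-11-02"]),
})

# A nested condition: keep rows where the grade is at least 5,
# or the test was taken during 2023.
filtered = results[
    (results["Grade"] >= 5) | (results["Test Date"].dt.year == 2023)
]
print(filtered)
```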



3.      Group by

 

“Group by” is one of my favorite functions in Prism Analytics. With “Group by”, you can group your data by the fields you choose and order them as you like. You can also add summarization fields (Count, Sum, Max, Avg, and Min).

 

For our example, we will group by Employee ID, Full Name, Grade, and Test Date; those are the fields that will be in our report going forward.

 

We will also use Count, Max, Min, and Avg as summarization fields in our report.

 

Let’s say your population took different exams, each exam had a grade, and you want to know the highest, lowest, and average grade for each student. With “Group By”, Prism Analytics will combine those rows and give you the summarizations you select.
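
To make the idea concrete, here is a small pandas sketch of that same Group By with Count, Max, Min, and Avg summarizations. The exam rows are mock values, not the actual dataset from the screenshots.

```python
import pandas as pd

# Mock exam results for two employees (placeholder values only).
results = pd.DataFrame({
    "Employee ID": [21006, 21006, 21006, 21007, 21007],
    "Full Name": ["Sarah Davis"] * 3 + ["Alex Placeholder"] * 2,
    "Grade": [8, 2, 6, 7, 5],
})

# Group by the employee and summarize the Grade column, mirroring the
# Count / Max / Min / Avg summarization fields described above.
summary = results.groupby(["Employee ID", "Full Name"], as_index=False).agg(
    Exams=("Grade", "count"),
    Max_Grade=("Grade", "max"),
    Min_Grade=("Grade", "min"),
    Avg_Grade=("Grade", "mean"),
)
print(summary)
```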



*All the data above is mock data, used for demonstration purposes only.


After we complete the Group By, this is the result for our dataset.

 

As we can see, employee ID 21006, Sarah Davis, took 8 exams in the year. Her MAX grade was 8 and her MIN grade was 2, with an average of 5.87 across all tests.

               

As we saw in the examples and functions above, the possibilities with Prism Analytics when transforming data are endless, and we are just getting started. In Part 4, the next blog in this journey, we will cover a few more functions, such as Join, Manage Fields, and Validation.

 

We hope you have been enjoying this Prism Analytics journey, and we are very excited to take you further. Since the first blog, so much has been covered that we encourage you to go back as needed and apply all the concepts to your own Prism Analytics environment.


Author: Thales from Florida



 



