Monday, April 17, 2023

Smashing Bugs

As I wrote about last week, I have a project that went live.  While I am really excited about this, I am now dealing with the downside: bugs and requests for functionality that never came up while we were developing the specifications.

 

Bugs 
Photo Credit: National Museum of Computing.      Attribution-NonCommercial-ShareAlike.  https://archive.org/details/emfcamp_photo_15191485628


To be fair, not all of the issues are really bugs; some are just a matter of users working with the system in unintended ways.  As a systems analyst and the designer of the project, I try to shield the developers from the multitude of user noise, gain an understanding of each problem, and relay that understanding to them.  For me, this is a great way to learn how the users actually use the system, and it helps me understand what other enhancements might be beneficial in streamlining the work and removing waste from the value stream.

 

The first report we received was that a user couldn’t save her input because the entire page wasn’t loading.  As we handle transactions for multiple employers and the line number request system is only needed for one of them, I was somewhat surprised to get this report from a user at the employer that does not need account numbers assigned.  I submitted a ticket and spoke with the developer, and it turns out there was a bug in the code.  Somewhat simplified: the front end got to the point where it was displaying the data for a line number and the back end said, “I don’t know what you are talking about.”  The front end said, “That’s okay, I have all of eternity to wait…” and so it did.  According to the developer this was an easy resolution, but it came up again a few days later.  I am crossing my fingers that it is fixed now; even though I didn’t code it, I am the face of the enhancement.

 

The second report was a user being unable to save data.  This was definitely a user error: the transaction was set to void when she attempted to save, and the system doesn’t like that status and refused to go along with the request.  I don’t know whether this behavior is pre-existing or part of the new code.  Regardless, it seems to me that the user should get a warning or error message in this scenario, so I have added it to the future enhancements list, ranking it near the top and planning to fold it into our data validation project.  As data validation has already started, we may wait for the next iteration of that project to include a message for this error as well as the similar case where the same datum is left null.
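The kind of check I have in mind could be sketched like this. The function and field names below are hypothetical, not the actual system's API; this is only an illustration of surfacing a clear message instead of silently refusing the save.

```python
# Hypothetical sketch of a pre-save validation that returns user-facing
# messages instead of silently rejecting the request.

BLOCKED_STATUSES = {"void"}

def validate_before_save(transaction):
    """Return a list of messages explaining why a save would be rejected."""
    problems = []
    status = transaction.get("status")
    if status is None:
        # The similar null-status case mentioned above.
        problems.append("This transaction has no status; set one before saving.")
    elif status.lower() in BLOCKED_STATUSES:
        problems.append(
            f"Transactions in '{status}' status cannot be saved. "
            "Change the status and try again."
        )
    return problems
```

An empty list means the save can proceed; anything else is shown to the user as a warning.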

 

Of course, there is the issue that was directly my fault: I take full responsibility for picking the wrong field for a data validation point.  I try to make my projects as ready to code as I can for our developers, including detailed diagrams that tell them which fields in the database to reference or write.  One critical piece of information is displayed in two different areas.  It turns out that the area I picked isn’t always populated.  Making it worse, the other area isn’t always populated either, yet this is a critical piece of data because a line can’t be assigned without it.  As it is only a binary y/n flag, we may just add the ability to select a value when the data isn’t available in the requisition.  It is an important lesson for me to validate fields better than I did in this example.
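The fallback we are considering could be sketched as a simple coalesce: check the field I originally picked, fall back to the other area, and finally let the requestor choose. Field names are hypothetical.

```python
# Hedged sketch: return the first populated y/n value from the two areas
# the flag lives in, or None so the UI can ask the requestor to pick.

def resolve_line_flag(primary, secondary):
    """Coalesce the binary flag across both source fields."""
    for value in (primary, secondary):
        if value in ("y", "n"):
            return value
    return None  # neither area was populated; prompt the user
```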

 

Then we come to my favorite two issues: the lack of a batch process and the lack of emails for the recruiting team to track their progress.  As it turns out, nobody said anything about the need to request lines as a batch.  It is a well-known need, but it didn’t come up at all while we were doing our needs analysis and workflow documentation.  Then it happened: a recruiter sent me a message with 30 reqs.  She wanted to know if she had to request each one individually or if she could just email the list to the transactions team that processes the line requests.  I had to remind her that the purpose of the project was to eliminate those emails because of their volume and the amount of missing information.  She wasn’t pleased, but I did offer to devise a method for this to be addressed in a future release.

 

My other favorite issue came from the same individual.  She said that she relies on the email communications to track when things were requested and when they were completed, specifically citing the value of the time/date stamp on the email.  For me, this is an easy fix: let’s produce a report.  I explained that we have audit logs in the system and that I could easily develop a report and set it up so that she can pull it on demand.  We worked together to figure out what the most pressing information was and developed a prototype.  She was okay with the sample, and I am now waiting for her approval to publish the report in the system.  Alternatively, if she asks me to pull the report for her again, I will just publish it in the system.

 

While none of these were major issues, they are inconveniences for the users, and my goal is to make things easier for everyone involved.  While the teams I am working with are very supportive of what I am trying to do, I hate that we can’t simply release such an enhancement without issues.  I am sure that, as I get better at what I do, we will get better at publishing projects that don’t have them.  We should find out soon, because another project I am working on has its first phase going live next week, and we can cycle through the bug-reporting challenges all over again.

Sunday, April 2, 2023

It went to production!

It took a while. The prework, the process development, the waiting for development, the development process… but it finally went live, and my simple little process is now in our production environment.

The Problem

The problem was simple: communication.  In our recruiting process, we need to assign accounting line numbers to our job requisitions.  These are usually requested by the recruiter from our transaction processing team once the candidate has been selected but has not yet started.  Unfortunately, the communication around this process was email based and frequently became confusing and challenging for both the recruiting team and our transaction processing team.  Having witnessed the issue firsthand, I set out to find a solution.  What if they could just push a button on our web interface?

 

Too much email, too little detail

For a line to be assigned, we need certain information.  This information is collected in the job requisition, but it does not always get passed along to the transaction processing team.  This creates additional communications (lean concept: waste) between the transaction processing team and the recruiter.  The required information is:

  • Requisition Identification Number
  • Job Title Identification Number
  • Job Title
  • Department Identification Number
  • Department Name
  • Full Time Equivalent (FTE)
  • Pay Basis
  • Account Number(s) to which the expense is allocated

For the application, we needed additional information:
  • Request Date
  • Requestor
  • Recruiter

With this information, we can automate the request process.  With just the push of a button, the system can gather this information and pass it to the transaction processing team, which can then see the requests on their dashboard.  Of course, sending the transaction processing team a notice that action is required, and the recruiter a confirmation that the process is in motion, helps with the human factor until we can fully automate the line selection process.
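The gate behind that button could be sketched as a completeness check: the request can only be auto-generated when every required field listed above is populated. The field keys below are my own shorthand, not the real database attribute names.

```python
# Hedged sketch of the completeness check behind the request button.
# Keys are illustrative stand-ins for the required fields listed above.

REQUIRED_FIELDS = [
    "requisition_id",
    "job_title_id",
    "job_title",
    "department_id",
    "department_name",
    "fte",
    "pay_basis",
    "account_numbers",
]

def missing_fields(requisition):
    """Return the required fields that are empty or absent from the requisition."""
    return [field for field in REQUIRED_FIELDS if not requisition.get(field)]
```

If the returned list is empty, the request can be routed to the transaction processing team's dashboard; otherwise the requestor is told exactly what is missing, which is the information that used to get lost in email.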

 

What does this process look like?

In conjunction with representation from recruiting and the transaction processing team, we developed this flowchart to show what the process looks like.  I am the rare bird that gives the development team flowcharts and identifies the necessary database attributes, but they seem to appreciate it, and it makes coding more efficient than trying to otherwise explain what we are trying to accomplish.

Request-A-Line Flowchart


There were changes during development, after coding started, that were not captured in the flowchart.  My favorite improvement came from recruiting: the confirmation notification now displays the required codes and tells the requestor to cancel out of it to make changes if any of the information is incorrect.

 

How is it working?

It is too soon to tell.  I wrote up a brief how-to document for the recruiting team, and communication to the team is being managed by their leader.  The members of the recruiting team who engaged in the project’s development were quite excited and wondered what else we could apply similar logic and processes to.  The transaction processing team is small and was even more instrumental in the development of the process.  All in all, everyone is excited about the rollout and understands that things might not be perfect and that we will make updates to improve the process as needed.  In my opinion, the bigger win was helping people see that they don’t have to be satisfied with the system in its current state; if they have an idea, they can pitch the improvement.  Seeing people open their eyes to the possibility of change is more magical than anything we can ever do with these magical boxes of sand and silicon.

Sunday, March 19, 2023

Oh Tableau, I want to comment my code... my mind is like a steel sieve!

Commenting code

I like to comment my code.  My professors made it a point to ensure that everyone understood the importance of comments, and I have tried to faithfully apply what I learned.  Unfortunately, when we drift to applications like Tableau, there really isn't a great mechanism to make comments that aren't visible to the end user of the dashboard.  To be fair to Tableau, Excel doesn't do this well, and I suspect that PowerBI doesn't either.  While we could make comments in our underlying SQL, I'm trying to pull as many records as I possibly can so that our data sources can serve unexpected future needs.

Our data sources are pulled directly from Oracle.  They start with my initial generation of a query, based on an entity-relationship diagram that I created, to provide information that looks like what you would typically find in a report writer in an HRIS.  These queries are then turned over to our IT team, who create a view and then a materialized view to which Tableau is linked.  As much as I wish I could control the entire process, the folks I work with are amazing, talented, and extraordinarily responsive.  I am fortunate to work with these people because I am always learning new things from them.

The "code review"

While our HRIS team is composed of four people and a couple of vacancies, currently only my supervisor and I have the technical skills.  I rely on my supervisor for the initial reality check on our dashboards.  While we were going through some of my work, we noticed that some of the numbers weren't matching up, and I had trouble understanding and explaining why in the moment.  One of the dashboards was developed months ago, and the ongoing modifications to both the underlying query and the filters applied in Tableau made it impossible for me to explain, with certainty, the exact changes needed to run a query against the source and get the same information shown in the Tableau dashboard.

The magic of commenting code lets you know why you made specific decisions.  You might want to know days, weeks, months, or even years after you wrote your code why your code does something.  Just because I know why I applied certain filters in Tableau now, doesn’t mean that I will remember three months down the road.  Beyond a simple conversation with my supervisor, with whom I have a great relationship, I would hate to put my Chief Human Resource Officer (CHRO) in a position where he was having a hard time defending the numbers.

The old pen and paper

This problem isn’t insurmountable.  We have the old ways of doing things, and I try to be good about keeping notes on how things are done, frequently in a paper notebook.  Compared to code comments, this is far from ideal.  Code comments are concise and available where you need them; that is never the case with my paper notebooks.  My paper notebooks also don’t let me remove the comment characters to try the original code.  While I understand that inline comments in Tableau may be challenging from a development standpoint, it is a big miss from my perspective.

The errors

Outside of the “code review”, I was able to identify some of the errors and issues we were coming up with.  It didn’t take a particularly long time, but it definitely took longer than I would have liked while I was in a meeting.  The answer was pretty clear once I found it in my notes, and setting up a method to validate it was a simple process.  It was a filter on five job titles, but under the self-imposed pressure of the meeting, I could not come up with a mechanism to verify it.  After the meeting, I could easily identify it, look up the correct job title codes, and create a simple query to share with my supervisor to give him the correct answer.

The code review was also helpful for identifying some other issues I had.  Sometimes it is nice to just chat about the work and get perspective from another set of eyes.  Errors might be dumb, but because you made them, you might not see what is right before your eyes.  An example of this revolved around a couple of formulas on my dashboard that were giving me the same result.  There was no reason for them to be the same, and I checked them a couple of times, getting the same result every time.  After our meeting, I checked again, and I must have been clicking on the wrong formula before, because there it was, front and center: my formula was clearly referencing the wrong attribute in the relation.

Feedback is a gift

As an HR Manager, I have told people this very phrase for years: feedback is a gift.  Feedback allows you to learn the perspective of someone else.  It offers you a unique insight that you may not otherwise have.  Right or wrong matters not; it is an opinion, and you have a choice about what to do with it.  I value these “code reviews” for the feedback gained, and I have learned a lot through these exercises.  Sometimes it is a matter of someone asking me the questions that I haven’t asked myself.  In the end, regardless of whether I agree, I find the feedback extraordinarily valuable.

Feedback for the Tableau development team

Tableau development team, if I may: the ability to comment my dashboards, so that I can remember what decisions I made, when I made them, and why, would be very helpful.  I am sure that I am far from the only user who would appreciate this feature.  If it does exist, please let me know where, because none of the Tableau experts at my institution were aware of such a feature in your otherwise amazing product.

Thursday, March 9, 2023

What the ETL....

We have a problem...

As part of our HR dashboard development, we are putting together a time-to-fill (TTF) dashboard.  Time to fill is a measure of how long it takes to fill a position.  For our purposes, we are breaking it down to help identify where there may be bottlenecks in our process.  Specifically, we are looking at the following intervals:

  • Requisition Receipt in HR to Posting
    • This is the date that the requisition is received in HR.  The requisition may have come through a formal approval process or may not be required to go through one; either flow requires a requisition to be completed.  The requisition provides HR with the pertinent information required to start the recruiting process.  This interval measures the time in days between when HR receives the requisition in the ERP and when it is posted in our applicant tracking system (ATS).
  • Posting to Candidate Selection
    • This measurement starts when the recruiter enters the data received in the previous step and ends when the recruiter sets the status to candidate selection, indicating that the requesting department is screening and interviewing candidates.  Some positions may take longer due to factors such as availability of talent in the market or internal posting requirements.
  • Candidate Selection to Offer
    • Candidate selection includes any resume screening, phone screens, interviews, etc.  This process continues until there is an accepted offer.  The offer must be accepted because, if we stopped this stage when an offer is made, someone might decline it, resulting in days or weeks being added to this step.
  • Offer to Start
    • This phase remains largely outside of the organization's control.  While the organization can establish time limits on tasks such as drug testing, much of the process is in the candidate's hands.  Some candidates get their paperwork completed and documentation submitted quickly while others move much slower.  Some candidates want to give their current employer a longer notice period than others, who may not need to give notice at all.  If the orientation session is full, we may be waiting an additional week for the next session.
  • Time-To-Fill (TTF) is calculated from when HR receives the requisition until the offer letter is signed.
  • Time-To-Start (TTS) is calculated from when the HR receives the requisition until the employee reports to orientation.
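The interval math above can be sketched in a few lines. The milestone names below are my own shorthand for the dates described in the list, not the system's actual field names; a missing milestone simply yields no value for that interval.

```python
from datetime import date

def interval_days(start, end):
    """Days between two milestones, or None if either has not happened yet."""
    if start is None or end is None:
        return None
    return (end - start).days

def requisition_metrics(m):
    """Compute the dashboard intervals from a dict of milestone dates."""
    return {
        "receipt_to_posting": interval_days(m.get("received"), m.get("posted")),
        "posting_to_selection": interval_days(m.get("posted"), m.get("selection")),
        "selection_to_offer": interval_days(m.get("selection"), m.get("offer_accepted")),
        "offer_to_start": interval_days(m.get("offer_accepted"), m.get("start")),
        # TTF: requisition receipt until the offer is accepted.
        "time_to_fill": interval_days(m.get("received"), m.get("offer_accepted")),
        # TTS: requisition receipt until the employee reports to orientation.
        "time_to_start": interval_days(m.get("received"), m.get("start")),
    }
```

One design note: keying everything off the accepted-offer date, as described above, means a declined offer never closes the TTF clock prematurely.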

This is great but... what is the problem?

We have a few factors working against us here.

  • Data is an afterthought.  Not only was what we wanted to calculate an afterthought, but so was what we would consider in the calculation, where the data is located, how it is stored, and how to get it someplace where we can work with it.  In talking with my colleagues in our Data Professionals group, this is a common scenario.
  • Pooling of positions helps save the recruiting team time, but it makes the data messy.  Pooling is the process of combining several jobs or requisitions under a single posting.  This is great for recruiting because it makes posting a position much quicker, as they don't need to do as much work for the pooled position.  It is also nicer for applicants, who don't have to see that we might have 20 positions posted for a single title.  It is a great process and I highly support it, but it means we have to do some extra work with our data.
  • Lack of consistency in our input.  Just because we use a number doesn't mean it is a number.  In and of itself, this is fine: unless we need to calculate a value, it is best to use a string rather than a number, and strings made of numbers are great identifiers.  It is okay if we consistently use a letter to identify that a job is posted to an agency, but people aren't always consistent, and when the agency identifier is missing, our posting number looks like any other posting number that isn't posted to an agency.  And since the field isn't numeric, recruiters sometimes decide it would be helpful to put a note in the field rather than the requisition number.  Oh people, I love them.
  • Some of our data is in this system and some of it is in that system, and we didn't pay to have the systems talk.  Of course, the API documentation doesn't indicate that all the data we need is available through the API, either.  The report writer in the ATS appears to expose a lot of data that isn't available in the API.

Time for the duct tape and bubble gum, what the ETL!

Not getting the data from one system to another is not an option.  We need to figure out how to Extract, Transform, and Load (ETL) the data.  The data in the two systems are not compatible in their basic forms and need to be massaged: extracted from the ATS, transformed to be compatible with the data in our ERP, and then loaded into the ERP so that we can access it in Tableau.  We're still working through this process but are nearly there, and I love sharing the challenges and learnings I have had along the way.

Extraction 

Before we could even extract the data, we had to define what it is that we want.  As good as the report writer is, the way the data is organized isn't particularly intuitive, and the ATS documentation doesn't dig very deep into what data is stored where or what the data means.  As we were building the report we wanted to extract, there was a lot of back and forth between looking at the postings and the available attributes in the report writer.  After a lot of moving fields, verifying they are what I think they are, and more verification, we arrived at a draft report for export.  Of course, after exporting it, I realized there was way too much data to run it through PowerQuery in Excel; so back to the report to set up some filters to limit the data, and then back to Excel.  After a few of these rounds, I was able to generate a report small enough to be reasonably transformed in PowerQuery in Excel.

Transformation

What can I say, the data is messy, much messier than I anticipated.  I know enough about PowerQuery to be dangerous, and I set off to make the data useful.  The biggest problem I face is that the requisition system in our ERP generates a requisition number that is not a direct match for what is in our ATS.  The ERP's number is system generated and consistent; the ATS's is entered by human hands.  As previously mentioned, sometimes we have a number, sometimes we don't, and sometimes we have additional characters or notes where the requisition identifier should be.  Let's take these one at a time.

  1. When we don't have a number for the individual position, or in some cases when we have a 1, I know the posting number is the real identifier.  While consistency in procedure would be nice, we live in the real world, and people are not as consistent as computers.  A simple formula helps us here: creating a new column called FinalReqNo, we check whether the position number is null or 1; if it is, we copy the posting number to the new column, and if not, we copy the existing value.
  2. When we have a number, it is usually straightforward and we should be able to use it.  It should be a five-digit number greater than 60000.  I hate to drop too many rows, but we have enough requisitions that I hope what remains is a statistically representative sample.  To make it easy, I split the column at five characters.  I keep the trimmed-off data so that we can go back, look at what was cut off, and inform the recruiters about the records we need to fix.
  3. At this point, running the steps in this order, we should have stripped out any text.  This lets me continue with the process, and if we get errors loading the data into Excel, I go back to see what they are and either ask the recruiters to fix the source data or tweak the formula to handle them.
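The three steps above can be sketched in plain Python; the real work happens in PowerQuery, and the field names here are hypothetical. The function returns both the cleaned identifier and whatever was trimmed off, mirroring the decision to keep the cut-off text for the recruiters.

```python
# Hedged sketch of the PowerQuery cleaning steps.  Field names are invented.

def clean_requisition_number(position_no, posting_no):
    """Return (final_req_no, trimmed_text); final_req_no is None when invalid."""
    # Step 1: a missing position number (or a literal 1) means the posting
    # number is the real identifier (the FinalReqNo logic).
    value = position_no
    if value in (None, "", 1, "1"):
        value = posting_no
    value = str(value)
    # Step 2: split at five characters, keeping the overflow so recruiters can
    # be told which records need fixing.
    kept, trimmed = value[:5], value[5:]
    # Step 3: anything left that is not a five-digit number above 60000 is an
    # error to investigate rather than a requisition number.
    if kept.isdigit() and len(kept) == 5 and int(kept) > 60000:
        return kept, trimmed
    return None, trimmed
```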

Beyond this, I have steps to reorder the columns and set data types.  These are fairly simple tasks that help me when staring at this data for hours at a time.

Load

Finally, we can load the data.  If all goes well, the data loads without errors.  If we have errors, as mentioned above, I examine them and either request that the source data be fixed or adjust the transformation steps to accommodate them.  At this point, I repeat the process with our ERP data, but the ERP data is our source of truth for requisition numbers, so the work is not as intensive and is largely a matter of changing some formatting.  I still run it through PowerQuery to ensure consistency, as I am repeating this process manually for the time being.

For now, I am merging the data in Excel using Excel's Data Model feature and creating a pivot table.  So far this is working, but it is not the end game; the end game is getting the data into Tableau with automated processes.  This ETL process serves as a blueprint for the development team.  As an analyst, I try to turn over "code-ready" projects to our developers.  This seems to make them happy, as it is not what they usually get, and if they can develop my stuff quickly, I can likely get priority in the development queue because my projects are quicker to build.

Next steps

Unfortunately, our ATS is lacking some desired functionality, so we are stuck with duct tape, bubble gum, and baling wire to hold things together.  While we would love to have an API or SFTP to transfer this data, we are not so lucky and have to rely on the ATS delivering reports via an email schedule.  Once we get this process rolling and are happy with our results, we may have the ATS provider write us a query, but that costs quite a bit of money, and we don't want to pull the trigger on such an expense until we are happy with the data in our dashboard.

The email delivery also posed a problem because I had a table calculation, so I am rewriting some of the process to happen in a Tableau data source rather than the ATS.  I am happy, however, to have gotten to the point we are at.  This will allow us to finish the development of our first three scorecards, which focus on staffing.



Saturday, February 25, 2023

Meet my new friend Tableau

I'd like to introduce you to my new friend, Tableau.  Why am I talking about Tableau and not the EAC and IHRSM projects?  It's quite simple: as of today, there is nothing new to publicly update about our EAC and IHRSM implementation projects, although there is some background action.  It is really exciting to see these projects moving ahead, as they will take our institution to a new level; the improvements to the employee experience should be tremendous, and the experience for the people who work in HR and Payroll should be greatly improved, too.

 

My new friend Tableau... a little background information.

Our institution has been trying to become a more data-centric organization.  Every department works at its own speed, given its own needs, talents, and abilities.  The institution as a whole has seen enough value in being data-centric that we have invested in Tableau and a small team that leads the charge to build dashboards for various departments.  Better yet, they are empowering departments to build their own dashboards, so when I learned about that, I jumped at the opportunity to take their Tableau training.  Unfortunately, I couldn't develop the interest in my previous department to take on a dashboard project.

For the uninitiated, Tableau, according to their website, is "a visual analytics platform transforming the way we use data to solve problems—empowering people and organizations to make the most of their data."[1]

I have always been fascinated with HR metrics and have pushed companies that I have worked for to pay more attention to what the HR numbers are saying.  My typical experience is that any HR numbers that were published were to serve a financial or basic operational need, not to tell the story of what is happening with our workforce.  When the opportunity came up I volunteered and, wow, have I learned a lot along the way.

Through a series of meetings, we decided what we wanted to display.  Using Tableau wasn't off my radar, but I have done similar projects using Excel, typically for my own use, to be able to talk about where we stand within our HR practice.  As I have done a lot of this with Excel, it was a logical starting place for me... big mistake.  While I won't say that Excel can't handle large data sets, these are the largest data sets I have ever worked with, and Excel was sluggish, at best, processing them.  Stripping down the data sets meant eliminating information I thought I might need as we refined our requirements.  Given these struggles, I took it upon myself to start developing the dashboards in Tableau.

 

Greetings and Farewells

Our first dashboard is called "Greetings and Farewells" and presents a basic look at:

  1. Hires - last 13 months and last seven years
  2. Separations - last 13 months and last seven years
  3. Separation reasons for the last 13 months

 This is all fairly straightforward information that any organization can easily compile to show whether the recruiting function is keeping up with people leaving the company... after all, nobody will stay forever.

As Tableau provides a reactive display, I encourage you to visit the original on the Tableau Public website to display the dashboard as I originally intended.

 

For this blog, I used a data set that I found on Kaggle.com as using my institution's data wouldn't be appropriate for this purpose.  

 

How does this differ from the real data?

Given that I am using a different data set, there are some differences in how I had to display this data.  Primarily, this data is static; it will not change.  In real life, this data set will continue to grow.  I am lazy (read: efficient) and want to spend as little time as possible maintaining reports.  In both cases, I set up a filter to limit the display to the previous seven years and the last 13 months.  With live data, we needed to stop the year display at the end of the preceding year for the annual chart and at the end of the preceding month for the monthly display.  This is established with a couple of calculated fields applied to the filters.
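The logic of those two calculated fields can be sketched in plain Python: the annual view keeps only complete years before the current one, and the monthly view keeps only complete months before the current one. The function names are my own; "today" is passed in to keep the logic testable.

```python
from datetime import date

# Hedged sketch of the two calculated-field filters described above.

def in_annual_window(event, today):
    """Event falls in one of the seven complete years before the current one."""
    return today.year - 7 <= event.year <= today.year - 1

def in_monthly_window(event, today):
    """Event falls in one of the 13 complete months before the current one."""
    e = event.year * 12 + event.month
    t = today.year * 12 + today.month
    return t - 13 <= e <= t - 1
```

Because the windows are computed relative to "today", the report maintains itself as the data grows, which is the whole point of filtering this way.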

On our institutional copy, we have additional filters.  The departmental filter matters less and we are much more interested in the relation of this data to our executives, so our filters work up those hierarchies rather than a more traditional department relation.

 

Creating a data source

The Kaggle data set was also a little easier to use in that I didn't have to create it.  Our HRIS is homegrown, and identifying hire and term dates is a little more complex, particularly considering the multi-employer records, non-employees, students, etc. that exist in the live data.  To create our Tableau data source, I wrote a few SQL queries hitting multiple tables to capture the hire, separation, and promotion data, applying a category via a CASE statement for each event type along with a common set of attributes, then combined them with UNION ALL into one final query that pulls the data together.
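A toy version of that data-source pattern is shown below: per-event queries labeled with a category and stitched together with UNION ALL. I use Python's built-in sqlite3 so the sketch is runnable; the table and column names are invented, and the real queries run against Oracle with CASE statements deriving the categories.

```python
import sqlite3

# Hedged sketch of the union-all data source.  Schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE hires (emp_id TEXT, event_date TEXT);
    CREATE TABLE separations (emp_id TEXT, event_date TEXT);
    INSERT INTO hires VALUES ('A1', '2023-01-09');
    INSERT INTO separations VALUES ('B2', '2023-01-20');
""")

# Each leg selects a common set of attributes plus a category label,
# and UNION ALL stacks the legs into one event stream for the dashboard.
rows = conn.execute("""
    SELECT emp_id, event_date, 'Hire' AS category FROM hires
    UNION ALL
    SELECT emp_id, event_date, 'Separation' AS category FROM separations
    ORDER BY event_date
""").fetchall()
```

The resulting single event stream is what gets wrapped in a view and then a materialized view for Tableau to link to.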

Once the team was satisfied with the information, we created materialized views that we could then link to Tableau as a data source.  Ultimately, we are adding several queries to this data source to represent the employment life cycle, which I will discuss more in a future post.



[1] https://www.tableau.com/why-tableau/what-is-tableau

 

Monday, February 20, 2023

What in the world is an Integrated Human Resource Service Management (IHRSM) application?

In a very general sense, Gartner’s article “Integrated HR Service Management Solutions Reviews and Ratings” defines IHRSM as follows: “…solutions provide holistic platforms by which organizations can manage their physical and/or virtual HR shared services operations and communications.”[1]  Somewhat more specifically, Gartner states that common features of these systems include content delivery, knowledge management, ticketing and routing for case management, tools for process management, and digital document management.[2]  Ultimately, we can see this as the continued evolution of employee self-service.

 

Some background information on employee self-service

Wikipedia’s article “Human Resource Management Systems” states that employee self-service grew out of the development of payroll automation and enterprise resource planning (ERP), which started in the 1970s.[3]  The article goes on to detail how human resource information systems (HRIS) and human resource management systems (HRMS) evolved from these early payroll and ERP systems.  A natural evolution from these systems is to provide self-service, freeing the knowledge workers in HR to focus on tasks that require their specialized knowledge.

 

A quick scan of SHRM.org has references as early as July 2000.  My search turned up an article titled “HR Systems: Powering a Systems Overhaul” by Joe Dysart discussing the Los Angeles Department of Water and Power’s Human Services Department’s PeopleSoft upgrade which added the ability for employee self-service benefits enrollment.[4]  While this is commonplace today, it is amazing to think that this was a fairly new application around the time I joined the profession.

 

As we see the growth in machine learning and artificial intelligence in the world around us, it would be appropriate to consider how these technologies have been impacting the world of Human Resources.  In the SHRM article, “HR and Chatbots Are Learning Together”, Jeff Mike, vice president and head of research ideation for Bersin, Deloitte Consulting is quoted as saying “In HR self-service centers, bots are automating high-volume tasks such as changing an address or updating benefits information… On the talent acquisition side, bots can deliver "a streamlined candidate experience for high-volume recruiting activities" and guide new hires through the onboarding process.”[5]  This illustrates the potential for impact on both the labor reductions for manual work in HR and the potential to improve the employee experience.  Furthermore, illustrating the growth and movements of these platforms, Gartner has predicted that “by 2023, 75 percent of HR inquiries will be initiated through conversational AI platforms.”[6]

 

Benefits and use cases

On their blog, Leena AI, a leading developer of IHRSM software, cited Gartner’s research illustrating the main functions of IHRSM to include:

  • Employee and manager content delivery via a dedicated HR portal
  • Content knowledge bases
  • Digital management of HR documents
  • Business process management tools
  • Case-ticketing and routing
  • Service-level agreement (SLA) monitoring
  • Employee relationship support
  • Single sign-on (SSO) to transactional systems[7]

 

Gartner’s research also illustrates the typical HR Service Management technology stack.

 

Gartner’s paper “Hype Cycle for Human Capital Management Technology, 2020” discusses benefits particularly as they apply to larger (2,500+ employee) and multi-location employers, specifically pointing out the advantages of control and process standardization, along with robust metrics and reporting tools that parallel the sophistication of IT or CRM systems.[9]

 

In my own experience, I am eager to learn which product is selected in our procurement process and to see the benefits to my organization’s Human Resources department.  This is a game changer that, if properly implemented, will be embraced by employees, managers, and HR personnel alike.


