Case study: Collective problem solving to evolve business processes using digital tools
Common Good AI's vision is to foster inclusive civic engagement by transforming how communities find common ground and solve problems together. We are actively piloting the use of digital tools, including CrowdSmart, to inform our community engagement approach. This early case study documents lessons learned based on informal key informant interviews.
The Background
Common Good AI worked with a nonprofit organization focused on improving educational outcomes by enabling students’ equitable access to resources. Every six months, the organization conducts its staff performance review, the Goal & Development (G&D) process, which staff found cumbersome and time-consuming. The senior management team held multiple meetings and staff discussions to identify ways to improve the process.
The primary goal of this digital collaboration was to enable the organization to efficiently review its internal business process and ideate strategies to simplify and streamline it. Prior to the collaboration, the organization identified four levers of change to simplify the G&D process, which informed the design of the digital collaboration:
Change the frequency and/or timing of review cycles
Simplify the internal rubric
Change how rubric ratings are used to determine compensation
Increase emphasis on self-assessment
The Process
We developed a digital collaboration to understand staff perceptions of the G&D process implemented in December 2024, assess the effectiveness of recent changes, and gather recommendations to further evolve and streamline the process. In collaboration with the organization, we created the following prompts:
The G&D process is an effective vehicle to help me understand my strengths, areas for growth, and what I can do to improve and progress in my job.
My own experiences and perspectives, captured through my self-assessment, played a meaningful role in my review and development.
How would you rate the implementation of the 2024 end-of-year G&D process, in terms of clarity of instructions, documentation requirements, coordination with manager(s), etc.?
What recommendations do you have to enhance the G&D process for 2025? Enter each idea in a separate box.
We launched the collaboration via email to 71 staff, explaining the goals and the CrowdSmart tool. A reminder email was sent to encourage continued participation. The collaboration was open for approximately three weeks, from February 16 to March 6.
The Results
Overall, the digital collaboration gauged staff perceptions and identified specific actions to improve the G&D process going forward. The 41 participants generated over 230 ideas. The collaboration yielded three consensus insights:
Staff felt rushed; there was not enough time to complete the required G&D steps and the overlap with the holidays stretched many individuals.
Staff appreciated the self-assessment; it grounded the discussion with their supervisors and allowed them to focus on what needed the most attention, although they preferred keeping score sharing optional.
Staff found the rubric challenging; some were concerned about its length and complexity, while others struggled to highlight their accomplishments within a rigid framework.
We created a report detailing insights gleaned from the digital collaboration and provided six recommended changes to the process based on staff feedback. The senior management team implemented four out of the six recommended changes:
Modify the timing by switching to a once-a-year review that does not fall during the December holidays
Create a process for staff to provide feedback to their supervisors
Create an open-ended section in the rubric, so staff can highlight their accomplishments that may not fit into the existing structure
Make score sharing from the self-assessment optional
Perhaps the most valuable result was the learning from the process itself. After the digital collaboration, we interviewed a human resources (HR) manager to get her feedback for this case study. She provided her perspective on the CrowdSmart tool and commented, “It was a win for the team to get outside their own perspective and see what other staff members said. It became less of a me against a process and more collective problem solving in a way that was a huge shift for our culture.” Using CrowdSmart allowed the organization to take a problem that would traditionally have been HR’s domain and turn it into an asynchronous dialogue, with staff collectively solving problems and engaging with each other’s perspectives while advocating for their own.
With the CrowdSmart collaboration, staff could return multiple times to respond to the prompts. This allowed participants to interact with ideas different from their own and ideate together. The HR manager said that with traditional surveys she often received requests from staff to provide additional input after the deadline. Staff would say, “Oh, one more thing I didn’t think about,” or “It’s been a week and now my thinking is different,” but there was no mechanism to change answers in the survey. With this digital tool, the iterative process allowed staff to see feedback from their peers, then build on and evolve their own responses over time.
In addition, the HR manager received feedback from colleagues who loved “seeing and responding to others' feedback, as opposed to feedback going into a black box.” The digital collaboration highlighted that staff valued shared reflection; they wanted to think together, and this spirit of reflection was amplified by the process. The HR manager said the team decided, based on this collaboration, to move from a twice-a-year to a once-a-year review cycle, “which felt undoable a year ago.” The collaboration allowed the organization to make an informed decision more quickly, based on buy-in from staff and senior management created through the digital tool. The platform brought transparency to an often opaque process and accelerated decision-making on organizational policy changes.
What We Learned
We built, deployed, and analyzed this collaboration from start to finish. Through this engagement we learned that:
Reflecting on others' ideas promotes new thinking. There is tremendous value in seeing other people's ideas to evolve your own. By interacting with others' ideas and returning to provide answers more than once, participants feel less pressure to react and are more able to reflect meaningfully.
Visible consensus promotes transparency and accelerates buy-in. The process opened the “black box” in which organizational decisions are made. Seeing what the group thinks and where there is consensus accelerated buy-in, enabling new solutions to be implemented more quickly.
To improve for the future, show how consensus was established. We received feedback that participants wanted more concrete data to understand why certain ideas ranked higher than others. CrowdSmart did a good job of highlighting areas of agreement but was less adept at showing the justification or rationale behind consensus statements. In the future, additional data, such as rankings, could be made available to back up prioritizations.