What CI strategy will you adopt as you redesign work?
Part 5 of a 7-Part Series on Using Collaborative Intelligence (CI) to Improve Company Performance
As I wrote previously, when speaking with company executives regarding how best to use Collaborative Intelligence (CI) to improve their company’s performance, I ask seven clarifying questions, the fifth of which is: “Now that you’ve analyzed the work, what CI strategies will you use to redesign it?”
I ask this question to introduce generic CI strategies for redesigning work and to help executives think more clearly and broadly about the redesign itself. Explicitly contemplating this question helps organizations identify additional ways to achieve their performance improvement goals.
Regardless of the specific focus an organization adopts when pursuing CI-enabled redesigns, the work comprises one or more units. Figure 1 is a high-level depiction of a generic work unit. The work unit could be a process, a sub-process, or an individual task.
Work typically transforms inputs into outputs by undertaking one or more activities and, perhaps, making one or more decisions. The activities and decisions may be required to adhere to specific policies. In many cases, metrics measure the performance of one or more work components or provide comprehensive measures of the entire unit.
Any or all of these components can be impacted when what exists currently is redesigned to incorporate collaborative intelligence. The following generic strategies can be used, typically in combination, to guide the redesigns. (An example of their application will follow the strategy definitions, so examples are not included within the definitions.)
Please note that many of these strategies are appropriate for any technology-enabled change. They are not specific to CI. However, CI’s embedded intelligence allows for a broader application of these strategies to the work components.
Generic Collaborative Intelligence (CI) Strategies
Eliminate: The most straightforward strategy is eliminating things that no longer make sense or will be rendered unnecessary by related changes. In large organizations, many well-established work units create no net value but remain in place as vestiges of another time or, in the case of mergers and acquisitions, another entity. Rather than trying to assign such things to CI, it is best to eliminate them.
Change: This strategy examines the components to determine what changes in kind, type, or relationship would best support the achievement of the performance improvement goals.
Augment: One or more CI technologies are implemented to augment a human's ability to complete the work component.
"Outsource": Move responsibility for one or more work components to one or more CI technologies, and assign humans the task of overseeing, validating, and explaining what the CI did.
Reimagine: Rather than make targeted changes to existing work, identify entirely new ways of doing this work, which would not be possible without the capabilities inherent in CI.
Putting the Strategies into Practice
The following describes the redesign of the software requirements gathering process for a software development organization. The strategies listed above are interspersed in the “CI-Enabled State” description to highlight them.
Current State
The requirements-gathering process that was in place is probably familiar. It was part of a software development lifecycle (SDLC) best described as “waterfall with sprints.” (If you don’t know what this means, don’t worry.)
The software development organization’s business relationship managers (BRMs) would hold a series of meetings with business sponsors and future users of the system under development. The sponsors and users would articulate their desires, clarify them (via questions from the BRM), prioritize them, and negotiate the initial project scope.
After these meetings, the BRMs would prepare a list of requirements and related documents, then distribute them to the sponsors and future users for review. In interviews, the sponsors and future users admitted they could not validate these documents, nor recall who had proposed particular items, because they could not remember everything they had said and had not participated in the meetings held with other subgroups.
Depending on the project’s importance and scope, and the difficulty of scheduling meetings with senior business leaders and management, this process took between one and three months to complete, with outliers taking as long as eight months.
CI-Enabled State
The duration of the process and its lack of effective feedback and traceability were the focus of the initial performance improvement effort. (There was additional work on requirements gathering and the rest of the SDLC, but in the interest of not revealing too much, this description is limited to the initial focus areas.)
Since the greatest barrier to improved performance was the time required to conduct the initial requirements meetings, the group considered other ways to gather requirements using CI [Reimagine].
The meetings were held to elicit sponsors’ and users’ thoughts, so the group focused on other ways to gather these thoughts. Eventually, the group realized they could create a small app that allowed users to record their requirements on their phones and submit them to the teams. [Change from analog inputs to digital inputs. Eliminate the requirements gathering meetings.]
The recordings were converted from speech to text using machine intelligence. [Outsource the transcription from humans to intelligent machines (IMs).] The transcript, in its original language (AI can handle many different languages), was emailed back to the submitter for verification and to close the loop on their input. [Change in the outputs to personalized feedback.]
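The record-transcribe-verify loop can be sketched as follows. This is a minimal illustration under stated assumptions, not the organization's actual implementation: the `transcribe` callable stands in for whatever speech-to-text service is used, and all names (`Submission`, `process_submission`, the file path) are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Submission:
    submitter: str        # sponsor/user who recorded the requirement
    audio_path: str       # path to the uploaded recording
    language: str = "en"  # the original language is preserved end-to-end

def process_submission(sub: Submission,
                       transcribe: Callable[[str, str], str]) -> dict:
    """Transcribe one recording and build the verification message
    sent back to the submitter to close the loop on their input."""
    text = transcribe(sub.audio_path, sub.language)
    return {
        "submitter": sub.submitter,
        "transcript": text,
        "verification_email": (
            f"Hi {sub.submitter}, please confirm this captures your "
            f"request:\n\n{text}"
        ),
    }

# Stub transcriber standing in for a real speech-to-text API.
def fake_transcribe(path: str, language: str) -> str:
    return "The system must export reports as PDF."

result = process_submission(
    Submission("Dana", "recordings/dana-001.wav"), fake_transcribe)
```

In a real deployment, `fake_transcribe` would be replaced by a call to the chosen speech-to-text service; the surrounding loop is unchanged.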
The transcripts were processed by various natural language processing (NLP) technologies and fed into a large language model to transform each requirement into a User Story. [Outsource the writing of the stories.]
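The transformation step amounts to wrapping each transcribed requirement in an instruction for the language model. The sketch below shows one plausible prompt builder; the actual prompt wording and model integration are assumptions, not the organization's real prompt.

```python
def user_story_prompt(requirement: str) -> str:
    """Build the instruction sent to a large language model to rewrite
    one raw requirement as a User Story. Wording is illustrative; a real
    prompt would be tuned to the team's story conventions."""
    return (
        "Rewrite the following requirement as an agile User Story in the "
        "form 'As a <role>, I want <capability>, so that <benefit>.' "
        "Preserve the submitter's intent; do not invent scope.\n\n"
        f"Requirement: {requirement}"
    )

prompt = user_story_prompt("The system must export reports as PDF.")
```

The prompt would then be passed to whichever LLM API the team has adopted, with the model's reply stored as the draft User Story.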
These stories were then post-processed and clustered using other machine learning algorithms. The BRMs and developers reviewed and modified each story cluster to create a unique, comprehensive set of requirements. [Augment the identification of the requirements.]
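The clustering step can be illustrated with a deliberately simple stand-in: word-count cosine similarity and greedy grouping. A production pipeline would use embeddings and a proper clustering algorithm; this sketch only shows the shape of the step that hands grouped stories to the BRMs for review.

```python
from collections import Counter
import math

def similarity(a: str, b: str) -> float:
    """Cosine similarity over simple word counts -- a stand-in for the
    embedding-based similarity a real pipeline would use."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

def cluster(stories: list[str], threshold: float = 0.8) -> list[list[str]]:
    """Greedy clustering: each story joins the first cluster whose
    seed story is similar enough, otherwise it starts a new cluster."""
    clusters: list[list[str]] = []
    for s in stories:
        for c in clusters:
            if similarity(s, c[0]) >= threshold:
                c.append(s)
                break
        else:
            clusters.append([s])
    return clusters

stories = [
    "As a manager, I want to export reports as PDF, so that I can share them.",
    "As a manager, I want to export reports as CSV, so that I can analyze them.",
    "As a user, I want single sign-on, so that I log in once.",
]
groups = cluster(stories)  # the two export stories group together
```

Each resulting group is what a BRM and developer would review and merge into one comprehensive requirement.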
The stories were assigned unique IDs and tagged with the name of each sponsor/user who had submitted some form of the final request. The requirements “database” was made available to all sponsors/users, who could filter on their names or groups. [Change – new outputs.]
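The ID assignment and per-submitter filtering can be sketched as a small in-memory store. The `REQ-` ID scheme and all names here are hypothetical; the article does not describe the actual storage technology.

```python
import itertools

class RequirementsDB:
    """In-memory sketch of the shared requirements 'database': each
    story gets a unique ID plus the names of everyone who submitted
    some form of it, and sponsors/users can filter on their own name."""

    def __init__(self):
        self._ids = itertools.count(1)
        self.stories = []

    def add(self, text: str, submitters: list[str]) -> str:
        story_id = f"REQ-{next(self._ids):04d}"
        self.stories.append(
            {"id": story_id, "text": text, "submitters": submitters})
        return story_id

    def by_submitter(self, name: str) -> list[dict]:
        return [s for s in self.stories if name in s["submitters"]]

db = RequirementsDB()
rid = db.add("As a manager, I want PDF export...", ["Dana", "Lee"])
db.add("As a user, I want single sign-on...", ["Ravi"])
```

The same IDs are what later sprints would reference, which is what makes the traceability improvement described next possible.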
Each future sprint included the IDs of the stories it addressed, improving the quality of requirements traceability. [Change – new output.]
In the first live use of the CI-enabled process, the initial requirements were collected and processed in two days. Since the sponsors/users were keen to see the improvements, they were very engaged. In the future, as the novelty wears off, it will likely take longer for sponsors/users to submit their requests, but not much longer, and certainly not three months.
There is much more to this topic. For example, design choices become more challenging when deciding whether to assign decision-making, or policy-making and enforcement, to CI, and effective, accurate measurement of human performance is harder when humans work side by side with IMs. Future articles will explore these issues.