General Observations

The Multistakeholder Strategy and Strategic Initiatives Department (MSSI) developed a survey that was sent to the Competition, Consumer Trust and Consumer Choice (CCT) Review Team. The objective of the survey was to gather feedback after the Review work concluded.

Nine (9) of the fourteen (14) review team members responded to the survey. Responses to individual questions were not mandatory. Results are indicated as percentages, along with the actual number of responses in parentheses.

Cells in pink highlight the answers with the highest number of responses.  All comments appear as submitted, without editing.

Questions Pertaining to ICANN Organization

| Question | Number of Responses | 1 (least) | 2 | 3 | 4 | 5 (most) |
| --- | --- | --- | --- | --- | --- | --- |
| How satisfied were you with the support and guidance provided by the ICANN organization during the review? | 8 total | 0.00% | 0.00% | 0.00% | 75.00% (6) | 25.00% (2) |
| How effectively did the ICANN organization communicate key developments pertaining to the review to the community? | 9 total | 0.00% | 0.00% | 11.11% (1) | 55.56% (5) | 33.33% (3) |
| How satisfied were you with the level of information provided in implementation-related briefings? | 8 total | 0.00% | 0.00% | 25.00% (2) | 50.00% (4) | 25.00% (2) |
| How satisfied were you with Constituency Travel and FCM services throughout the review? | 9 total | 0.00% | 0.00% | 44.44% (4) | 22.22% (2) | 33.33% (3) |
| How responsive was the ICANN organization to your requests for information/clarification? | 9 total | 0.00% | 0.00% | 0.00% | 55.56% (5) | 44.44% (4) |
| How satisfied were you with support and guidance provided by the ICANN organization during the CCT Review? | 9 total | 0.00% | 0.00% | 22.22% (2) | 55.56% (5) | 22.22% (2) |

How satisfactory were the tools provided to conduct the review?

| Tool | Number of Responses | 1 (least) | 2 | 3 | 4 | 5 (most) |
| --- | --- | --- | --- | --- | --- | --- |
| Templates | 9 total | 0.00% | 0.00% | 44.44% (4) | 33.33% (3) | 22.22% (2) |
| Community Wiki | 9 total | 0.00% | 22.22% (2) | 11.11% (1) | 44.44% (4) | 22.22% (2) |
| Conference Calls | 9 total | 0.00% | 0.00% | 11.11% (1) | 66.67% (6) | 22.22% (2) |

The survey gave respondents the opportunity to provide additional comments about improvements, if any, to ICANN organization support. Below are the comments received from respondents.  [All comments appear as submitted, without editing.]

The assistance of the MSSI Team cannot be overvalued. Special mention of REDACTED
Management of the overage timeline
The CCT Review Team would have gotten off to a quicker start if there had been clearer guidance in the beginning about what data was available and what data was missing with regard to the research areas. It took several months for the CCT Review Team to figure out where data may exist within ICANN (or elsewhere). ICANN Support Staff was fantastic but some ICANN staff members were not very responsive to CCT RT data requests in the beginning. REDACTED, in her prior role, for example, did not respond to multiple requests. Perhaps more internal communication about expectations for ICANN staff to respond and help would be helpful. Every ICANN staff member who was assigned to the CCT RT was fantastic.

It would be helpful for ICANN org to be able to figure out how to encourage broader participation from participants. We were really slowed down by being stuck with just a few people contributing the vast majority of the work.
Staff could probably help with more content and coordination to/from interested groups.

The process began with the notion that the team should take more responsibility with respect to the budget but the budget process was not fully discussed. I'm supportive of a team being aware and respectful of the budget but when a process crosses fiscal years, real spending prioritization needs to take place and it didn't.
Email and the wiki are both fairly ineffective ways to collaborate. The world of collaboration technology has really progressed in the past 5 year and the IT department needs to empower review teams with better collaboration tools.

The most egregious problem was the availability of relevant data and information. Some of that is currently not available in the ICANN organisation but with third parties. Data sharing arrangements with third parties should go a long way to cure this problem.
Additional comments about support and guidance provided by the ICANN organization during the CCT Review are below.  [All comments appear as submitted, without editing.]
It would be helpful for ICANN staff to ensure that a staff member has institutional knowledge of where all existing data sources to help a review team are located. Therefore, this sort of information could be provided on day one of a review team meeting.
I think our staff did a fantastic job. They provided just the right amount of support and encouragement to drag the beleaguered review team over the finish line. I have nothing but praise for our support staff. The deficiencies are two folks and institutional rather than personal. First, as noted above, better tools for collaboration are necessary which requires better support from IT. Second, the management of reviews, implementation plans and overlapping efforts needs to be handled differently. These reviews need to be treated as critical path for further implementation and reform efforts, not parallel academic exercises.

Questions Pertaining to the Board Caucus Group

| Question | Number of Responses | No Opinion | 1 (least) | 2 | 3 | 4 | 5 (most) |
| --- | --- | --- | --- | --- | --- | --- |
| How satisfied are you with the depth of input provided by your Board Caucus Group? | 9 total | 22.22% (2) | 11.11% (1) | 0.00% | 33.33% (3) | 22.22% (2) | 11.11% (1) |
| How effectively did you communicate with your Board Caucus Group? | 9 total | 22.22% (2) | 0.00% | 11.11% (1) | 33.33% (3) | 33.33% (3) | 0.00% |

| Question | Number of Responses | Yes | No | Please Explain |
| --- | --- | --- | --- | --- |
| Did you find consultations with the Board Caucus Group useful? | 8 total | 62.50% (5) | 25.00% (2) | 12.50% (1) |
Additional comments/suggestions to improve collaboration between you and the Board Caucus Group?   [All comments appear as submitted, without editing.]
The Board Caucus Group was helpful. However, more meetings should have taken place and over a longer period of time to ensure that it was a productive, ongoing relationship. Also, some members of the Board Caucus Group could have been more engaged during the meeting so as not to fall asleep or be distracted by phones. One Board member seemed to coordinate with a CCT member who resigned to deliver a theatrical confrontation with the CCT, the moment the CCT found out that the member had resigned. This was not very productive.
Seemed like we got very little helpful input or guidance from the Board Caucus Group once it was created.
This falls into a broader category of the role of reviews. Ideally, the collaboration with the board caucus group should be such that there are no surprises from he board when recommendations are placed in front of them.
Certainly more timely and routine information sharing

Questions Pertaining to the CCT Review Team

| Question | Number of Responses | 1 (least) | 2 | 3 | 4 | 5 (most) |
| --- | --- | --- | --- | --- | --- | --- |
| How effective was the CCT Review Team at providing status updates on a regular basis to the SOs/ACs and the broader ICANN community? | 8 total | 0.00% | 0.00% | 25.00% (2) | 25.00% (2) | 50.00% (4) |
| How effective was the CCT Review Team at incorporating feedback received from the ICANN community? | 8 total | 0.00% | 0.00% | 12.50% (1) | 37.50% (3) | 50.00% (4) |
| How satisfied were you with the duration of the overall review and the overall effort of the review team? | 8 total | 25.00% (2) | 0.00% | 25.00% (2) | 50.00% (4) | 0.00% |
| How satisfied were you with the application process? | 8 total | 0.00% | 0.00% | 12.50% (1) | 62.50% (5) | 25.00% (2) |
| How satisfied are you overall with the conduct of the review? | 8 total | 0.00% | 12.50% (1) | 12.50% (1) | 62.50% (5) | 12.50% (1) |

| Question | Number of Responses | Yes | No | Please Explain |
| --- | --- | --- | --- | --- |
| At the start of the review, did you feel confident about your objectives, tasks and involvement needed as a review team member? | 8 total | 75.00% (6) | 12.50% (1) | 12.50% (1) |
| In order to continuously improve the review process, do you think that the review would have benefitted from the use of a temporary facilitator to assist with developing the scope and workplan, determining leadership, and overall conduct of the review? | 8 total | 50.00% (4) | 50.00% (4) | 0.00% |
Comments on the assessment of the CCT Review Team. Below are the comments received from respondents.  [All comments appear as submitted, without editing.]
A core group of review team members did nearly all of the work. Some members did not do more than dial in to calls. This was a bit frustrating during some of the more onerous parts of the review. However, the core group worked together really well and the broader group was always collegial and genuinely cared about the issues.
There was a core group that did most of the work, a second tier that contributed from time to time if nudged, and a last category of folks who did very little.  This was somewhat frustrating.
This took way too long, and it was hard to stay in sync with the community over the duration (there was much more engagement early on).
We should have organized around our budget and around community priorities better perhaps so that fewer items were OBE (overtaken by events) in the course of the review. The review perhaps took too long which was also a function of prioritization of activities, particularly outside research that could have been happening in parallel more often.

Additional Comments

Comments on how the review was conducted (what worked well, suggestions for improvement, etc.) are below.  [All comments appear as submitted, without editing.]
Not to divide the review into 2 separate parallel mini reviews
The core group really did a heavy lift to ensure that the commitments of the review team were fulfill and a data-driven assessment of the new gTLD program occurred
There was a separation between the information sought prior to the review started and the information that the Review Team itself thought would be useful.  This resulted in a very inefficient use of resources, and extended the time needed for this review.  Ideally, the review team would be involved from the start of the process to asses what studies would be useful to carry out its mandate.
We got a good start, but eventually were extremely bogged down with very little progress in the second year of the review.  We probably just should have accepted some limitation in scope and done what we could on the first pass.
It's noted pretty clearly in the final review document but the review is sorely lacking in data in a number of important areas.Data has got to become a bigger priority inside ICANN generally.
Success of these reviews rests on information access and sharing. I find that the substantive analysis rests on discussion. Those discussions work well and advance more quickly to consensus when team members are face-to-face.
What improvements, if any, would you suggest to make the construct of Specific Reviews more effective?  [All comments appear as submitted, without editing.]
More data, sooner. Also, it is important to continue to allow for independent experts to be appointed. The CCT Review team was fortunate enough to have an economist and cybersecurity expert because they were not required to be appointed by an official ICANN community.
Is there a way to assess whether members will actually do the work?
Probably don't allow an mid-stream addition to the report, and force fit to some timelines rather than allowing the desire for more information to extend out the duration of the review.

1. Earlier involvement of board caucus group to access priorities
2. Activity prioritization based in budget schedule
3. Better collaboration tools
4. Better data collection and community support

Information access and sharing, especially with third party entities are always going to help the team to move more rapidly to conclusions.
Additional comments shared about the CCT Review are below.  [All comments appear as submitted, without editing.]
It is important to ensure that those who are selected are ultimately committed to doing the work and spending the time to engage on the issues. ICANN has some fantastic project managers, but it is difficult for a small core group to do so much work without other review team members contributing. Also, on another note, at times, the travel could have been communicated earlier.
The contracted parties hold a much too protected status inside ICANN to be effectively reviewed and their processes reformed. There's a lot of work to be done.