
Notes / Action Items


Action Items

  1. Following today’s discussion, Michael to provide further information over the list regarding the use of both the general email address and alternates.
  2. ICANN Support Staff to create a google doc where team members may provide draft questions to ICANN Compliance and ICANN Legal. Please find the link here:
  3. ICANN Support Staff to include a list of EPDP Phase 1 purposes for processing registration data in the index.
  4. RDA Scoping Team members to review input provided in Assignments #1 - #4 and provide additional input by COB Wednesday, 20 October.
  5. NOTE: ICANN72 Registration Data Accuracy Scoping Team session [] will be on Tuesday, 26 October at 12:30 PDT/19:30 UTC.

Registration Data Accuracy Scoping Team – Meeting #2

Tuesday, 12 October at 14:00 UTC

  1. Welcome & Chair Updates (10 minutes)
    a. Results of doodle poll & schedule of meetings going forward
  • As of now, Thursdays at 13:00 UTC is the preferred time.
  • The individuals who previously noted that this time would not work have since confirmed that they can make this time slot.

               b. Miscellaneous administrative issues

  • Shortly before the meeting, Michael sent a message to the group re: use of alternates and a general email address.
  • Proposal to use alternates – if the group goes forward with this path, please consider appointing an alternate from South America.
  • Secondly, the email includes more details about the proposed email list for third parties to submit ideas to. ICANN org colleagues will monitor the email list.
  • If there are no objections to these two items, propose to move forward.


      • With respect to the general email address, no objection; however, concerned that having an open-ended email will not generate useful input. Unsure what the actual value of this would be. There may be better ways to get more productive feedback from the community. There is generally a public comment period on a preliminary issues report – while this is not a preliminary issues report, perhaps a public comment period could be helpful here.
      • Once the team has something on paper, we could seek input on that
      • Need to understand how to consider and incorporate any input received before establishing the email box
      • How well is the example working (ICANN ODP)?
      • The SSAD ODP email requires users to click a link before sending an email. There is a spam filter on the list, and admins can review messages for inadvertent spam. The list will be publicly archived so that all can see what is received. This could be considered similar to soliciting input from SOs/ACs at early stages.
      • This seems like a low-cost, low-risk thing to do – if it proves to be horrible, the group can dismiss the email address. There is no harm in trying this.
      • This is a good way to solicit input. Perhaps to limit spam, there could be a dedicated platform instead of email.
      • With respect to alternates, will work be assigned to alternates? This could be a way to abuse the size limits of the scoping team. There is value to having a finite number of members to this scoping effort.
      • These two items will be put on hold, and Michael will respond with further information over the list.

        2. Background briefing assignment #1 - see https://docs.googl[]

         a. Review assignment #1

      • Feedback from registrars – the EPDP did change some of the documents; for example, some of the data elements will no longer be collected. It will be helpful for this team to keep this in mind.
      • ICANN org prepared a document about the impact of EPDP.
      • Need to review the purpose of the accuracy program – groups see this differently

         b. Consider questions and team input received:

                   * What information is missing from the list of existing accuracy obligations?

      • There is no point in asking for information if there is no level of confidence that the data is accurate.
      • The Board noted that it will further consider pending RDS-RT recommendations following consideration of Phase 2.
      • This needs to be a fact-gathering exercise – for example, what are processes that ICANN has started in the past but has not concluded?
      • In EPDP Phase 1, the Team identified seven purposes for the processing of gTLD registration data. These should be included in the information index.
      • Any relevant information that needs to be pulled out for a specific conversation can be pulled out. RDA Support Staff will pull out the purposes.

                  * What follow-up questions are there in relation to the information that has been provided on the measures used by ICANN Compliance to monitor, measure, enforce, and report on the accuracy obligations?

      • Should the team come up with a list of questions to ICANN org/Compliance about whether, and in what ways, their job is difficult?
      • Inviting compliance to come speak with the team would be helpful – the first charge of the scoping team is very relevant to ICANN compliance’s role vis-à-vis accuracy.
      • This is not only a good idea; it is a necessary idea. The liaison does not act as an advocate; they are a conduit. ICANN is a major player in the concept of accuracy. Things have changed radically over the past few years.
      • Would recommend reviewing the information already provided and noting what is still missing and needs to be asked. ICANN Support Staff to create a google doc for compliance questions. In the document, include a link to the blog of already-answered questions.

                   * In relation to the materials provided concerning definitions, are there reference sources missing?

                  * What working definition should be used and why?

      • Section 3.7.7 references the term “commercially reasonable” in reference to compliance. This is a term of art and requires further discussion.
      • Section 4.6 of the ICANN Bylaws should be considered and included in the list of resources – law enforcement, consumer protection, etc.
      • Recommend to include the document prepared by Steve Crocker providing comparison of SAC058 to ERI (aka Project Jake), NIST and eIDAS standards sent to mailing list on 5-Oct-2021. There is a column that applies specifically to validation.
      • Do not understand the purpose of looking at this chart at this point in the call.
      • Team is reviewing the document and looking at input on definitions.
      • The purpose of this chart – suggest that this is a way of representing what the validation requirements are, and one can argue whether these are the right settings. Relevant point – is this a useful method for capturing what the requirements are, or for having competing versions of what the requirements are?
      • This maps the definition of what accuracy is from an engineering perspective rather than a legal perspective.
      • This document is a valid interpretation of the SSAC document.
      • Trying to get the team to a definition via an operational way is helpful. However, the team is supposed to be defining rather than rating.
      • As far as registrars are concerned, the definition of accuracy is built into the RAA. The definition, therefore, is very straightforward.
      • From a registrar standpoint, the accuracy definition comes from the 2013 RAA and it outlines what the definition of accuracy is – particularly the Whois Accuracy Program Specification.

             c. What next steps and approach should the team take to address this assignment?

            d. Confirm next steps

       3. Background briefing assignment #2 – see om/document/d/1OyzzAjZgvNkfZ5EekUvJ7PQg80vNZvJ3/edit []

           a. Review assignment #2

      • Steve Crocker has already shown how the SSAC document could be applied
      • Draft Report for study of WHOIS Accuracy – there was no further step on finalizing the report. The draft report was used to help inform the AOC review as well as WHOIS policy development work, but no final version was produced.

           b. Consider questions and team input received:

  • What information, if any, is missing to support the team’s deliberations on recommendations for how accuracy levels can be determined and measured?
      • In light of GDPR, the WHOIS system has changed. The ARS system had to stop working; how it was working was no longer valid.
      • Homework – next week, when we reengage on assignment #2 – registrars are noting that ARS is no longer fit for purpose. Question for the rest of the group – do you agree with this statement? Is there a way to move forward with the ARS that is GDPR compliant?
      • Fundamental concern – this doesn’t work and is no longer appropriate – not sure that is the job of the scoping team. Rather than making evaluations, the team should be defining accuracy.
      • A tool the team used to have is the ARS – is that an arrow still in the Team’s quiver? Perhaps the team needs to do some fact finding on this to determine if this is a valid arrow. Important to engage in fact finding.
      • Agree that ARS is no longer functional; would be helpful to ask ICANN legal about the viability of the ARS system.
      • ICANN org did provide additional information regarding the current suspension of the ARS. The Team will have a document in which the team may ask questions of ICANN compliance – perhaps questions can also be catalogued here for ICANN legal
      • The ARS as implemented is dead. ICANN made no attempts to implement minor or major changes to make this work – such as using escrowed data.

                                * What approach should the team take to develop these recommendations?

                                  * What information, if any, is missing to support the team’s deliberations on whether ARS needs a revamp or whether there are other ways in which accuracy levels can/should be measured?

                                * What approach should the team take to develop these recommendations?

                                * How much time and resources are expected to be needed to either revamp ARS or implement other ways in which accuracy levels can/should be measured?

          c. Confirm next steps

      4. Background briefing assignment #3 – see d/1NiwMk6qHOQRn7VdcW0Paj5OoC3tWAQpm/edit []

             a. Review assignment #3

             b. Consider questions and team input received:

                    * What is necessary to undertake such an analysis?

                     * What is the definition of “effective”?

                     * How are “accurate and reliable” to be interpreted (see also assignment #1 re. working definitions)?

             c. What next steps and approach should the team take to address this assignment?

            d. Confirm next steps


     5. Background briefing assignment #4 - see []

            a. Review assignment #4

            b. Consider questions and team input received:

                     * When & how are estimates of benefits and costs expected to be developed?

                      * In addition to outcome of assignment #1-3 and cost/benefit analysis, is there anything further that is needed for the scoping team to deliberate on whether any changes are needed to improve accuracy levels? 

                       * Based on response to previous question, what are the options the team can consider for how and by whom these changes would need to be developed?

            c. What next steps and approach should the team take to address this assignment?

            d. Confirm next steps


      6. Confirm action items & next meeting (TBC based on responses to doodle poll - [])