The call for the Applicant Support GGP team will take place on Monday, 19 December 2022 at 15:00 UTC for 60 minutes.

For other places see: https://tinyurl.com/3d56vrpk

PROPOSED AGENDA


  1. Welcome & Updates to Statements of Interest

  2. Task 1 (15 min.):

  • Briefing from GDS on the Implementation of the 2012 Applicant Support Program (10 min.)
  • Q&A (5 min.)

  3. Task 2 (15 min.) – see attached document:

  • Discussion and finalization of revised draft
  • Discussion of timing

  4. Introduction to Tasks 3, 4, and 5 (30 min.) – see attached document

  5. AOB

BACKGROUND DOCUMENTS


GGP Applicant Support Task 2 Input Request 05 Dec 2022

GGP Applicant Support Tasks 3-4-5 

GGP Applicant Support Work Plan & Timeline for Council.pdf

PARTICIPATION


Attendance

Apologies: Rubens Kuhl 

RECORDINGS


Audio Recording

Zoom Recording (including audio, visual, rough transcript and chat)

GNSO transcripts are located on the GNSO Calendar

Notes/ Action Items


ACTION ITEMS/HOMEWORK:

TASK 2:  1) Staff to accept the changes in the revised Input Request and circulate it to the WG; 2) Staff to send on behalf of the WG to the Advisory Committees/Supporting Organizations, Stakeholder Groups/Constituencies by mid-January 2023.

TASKS 3, 4, and 5: 1) Staff to circulate the framework documents to aid discussion on Tasks 3, 4, and 5, and 2) WG members are requested to provide feedback.

 

Notes:

  1. Welcome & Updates to Statements of Interest (5 min.)


2. Task 1 (15 min.):

  • Briefing from GDS on the Implementation of the 2012 Applicant Support Program (10 min.) – see attached slides
  • Q&A (5 min.)
  • Hoping to build something that will improve the uptake.
  • In 2012, if you applied for the program and didn’t make it, you couldn’t go through the regular application process.
  • Questions: 1) Interested to know if you had some metrics to track KPIs for the application program; 2) How did it work for the pro bono support? 3) What are the lessons learned? Answers: 1) Don’t think there were KPIs, but we will check; 2) pro bono consisted of a directory of services; 3) a lot of the Final Report outputs address lessons learned.  A good improvement would be to start outreach early. Hard to say what success looks like.  Increasing the percentage of applicants requesting support is one measure.
  • Question: 1) With these criteria, how to ensure geographical distribution, if this was an objective of the ASP? 2) Were they taking into account the motivation for the global South?  Answer: Least developed countries were the target; in the Final Report it was “struggling regions”.  Could consider measuring geographic criteria.
  • Don’t think the criteria were well-enough established – make sure for future rounds that we don’t leave it open, but make it really clear.  Make sure we provide guidance.
  • Question: What was the difference for .KIDS qualifying?  Answer: The other two did not make it through on at least one criterion.  See: https://newgtlds.icann.org/sites/default/files/sarp-results-20mar13-en.pdf.
  • Don’t think you can have a single gating criterion – needs to be a matrix.
  • Question: Was there any evaluation of the SARP as a mechanism?  Answer: Not aware of an evaluation, except it didn’t achieve the objective.
  • Don’t think this group is expected to review the previous process.

3. Task 2 (15 min.) – see attached document:

  • Discussion and finalization of revised draft
  • Discussion of timing
  • WG members had as homework to review the revised draft and provide comments on the list, if any, prior to this meeting. No comments were received, nor were any comments provided on this call.
  • For the timing to send the Input Request, WG members agreed that the second week in January 2023 would be appropriate with a deadline for a response in two weeks.

ACTION ITEM: 1) Staff to accept the changes in the revised Input Request and circulate it to the WG; 2) Staff to send on behalf of the WG to the Advisory Committees/Supporting Organizations, Stakeholder Groups/Constituencies by mid-January 2023.


4. Introduction to Tasks 3, 4, and 5 (30 min.) – see attached document

  • Could be used to help evaluate the program.
  • Have some proposed metrics and a requirement to prioritize.
  • My understanding is that these relate to the evaluation of the program, but I think there are a few of them that can certainly add to recommendations around criteria for scoring applicants. It's not for us to score, but we can make recommendations in terms of how applicants are scored.
  • Question: Is this a fixed list of metrics? Answer: Not a fixed list – some suggestions in Task 3 for evaluating the program; Task 4 is qualifying criteria. 
  • We are obliged to consider 17.9, and if we're going to reject anything that's currently in 17.9, we need to explain why it's not appropriate, but we can certainly add to that list.
  • Question: We have the list of those elements of the applicant support program: outreach, education, business case development, and application evaluation. Can we add more elements there, or is that list fixed? Answer: We have to consider what is on the list, but we can add elements.
  • Tasks 3 and 4 – General feedback from ALAC: 1) It’s hard to prioritize the metrics in 17.9 if we don't have information on what success looks like. So the recommendation from ALAC is to have a bit more granular information, some objectives and numbers. If you look at, for example, 17.9, it could be X number of outreach events targeting X number of potential applicants, so that it makes it easy for us to know what's important for our community, and it also makes it easy to evaluate the metrics after the fact.  2) Portfolio applicants: There was a question around putting a cap on the number of applications from a portfolio applicant. We didn't agree, but it's something that I wanted to put on the record from our working group.
  • I believe we are working on evaluating the program and setting metrics to be measured, not working on the evaluation of applications per se.
  • Regarding the definition of "public interest benefit": Is this defined somewhere?  For success metrics, how much weight is it given compared to the financial and geographic considerations?
  • It's time to start potentially putting up a straw proposal. I don't think that there is any one criterion that is more important than another. So, for example, Gabriela raised the concern about geographic diversity and the global South, and Paul then asked the question of whether it is a country. For example, consider a very wealthy business from a least developed country as compared to a community organization from a very poor part of an otherwise wealthy country. Neither is the perfect applicant, but neither should be disqualified. Rather, we should be making recommendations as to a weighting and scoring system, which firstly recognizes that by getting both applications in we're achieving success, because it means we've got more than 3 people who are aware of the program, who understand if they qualify or not, and who decide to make use of the program because it's not punitive, where you're all in or all out. And then the next stage is to provide some guidance in terms of how the evaluation would work, so that we get towards outcomes that are more desirable than others. But the first step is, we need applications to come in from people who are generally qualified in the broadest sense.

ACTION ITEM: 1) Staff to circulate the framework documents to aid discussion on Tasks 3, 4, and 5, and 2) WG members are requested to provide feedback.


5. AOB: None.
