ICANN recently established the Qualified Launch Program (QLP), which enables new gTLD registries to issue domain names to certain parties to help promote awareness that a new TLD is launching. These domains are somewhat like “anchor tenants” in a new shopping center – their presence makes the location attractive to other businesses.

A number of registries expressed interest in this type of program, and ICANN worked to deliver one without losing sight of intellectual property protections. The solution we arrived at permits registries to use a limited number of names in connection with registry launch activities, so long as those names do not conflict with the rights protection mechanisms required by the Registry Agreement.

How does the Qualified Launch Program work?

The QLP provides that a registry can give up to 100 names to third parties before Sunrise (a rights protection mechanism) if the name:

  1. is not on the list of Sunrise-eligible labels in the Trademark Clearinghouse (the “Sunrise List”)
  2. is on the Sunrise List, but is given to a Sunrise-eligible rights holder
  3. corresponds to a public authority, or to a place operated by a public authority, and is given to that authority

Why is the QLP being introduced?

The goal of the QLP is to give new registries additional tools as they open their doors for business. A generally available program is also more efficient than requiring each registry to apply individually for an Approved Launch Program.

How was the QLP introduced?

As with many other ICANN endeavors, we first developed a draft of the process, then published it and solicited community input. After analyzing comments and concerns, ICANN established the Qualified Launch Program and it has been automatically incorporated into the Rights Protection Mechanism Requirements [PDF, 167 KB].

The QLP is essentially the same as the draft version. Here is an overview of the updates:

  1. Slightly revised definition of names that can be given to governments/public authorities
    Comments suggested revisions to this category, for example incorporating “subdivisions or districts” of a city or region; these have been incorporated, as they are in keeping with the intent of the draft. We have also clarified that these names can be in any language, provided they fit the requirements.

  2. “Public authority” category of names made available to any TLD
    Originally, only names identified as “geographic” according to the Applicant Guidebook definition were eligible under the public authority category. Community feedback suggested that eligibility should not be limited to those TLDs that fell within the Guidebook definition, as a broader set of registries might also have a geographic orientation. Given what appears to be a low risk that governments or public authorities will use the QLP to circumvent rights protections, ICANN updated the QLP to allow any registry to give out names to these groups.

How do I use the QLP?

All registries may assign names, as provided by the QLP, after following the steps below:

  1. Notify ICANN of the intention to use the QLP by either submitting or updating the registry’s TLD Startup Information.
  2. Obtain a Sunrise List, including all the labels associated with a Signed Mark Data (SMD) file, from the Trademark Clearinghouse.
  3. Check the names you would like to assign against the list. You may give out QLP names to:
    1. Sunrise-eligible rights holders
    2. Public authorities
    3. Anyone, IF the name isn’t on the list AND you have assigned no more than 99 other QLP names
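As a rough sketch, the eligibility check above could be expressed in code. This is a hypothetical illustration only: the real Sunrise List comes from the Trademark Clearinghouse, and registries must track their own allocation counts.

```python
QLP_LIMIT = 100  # maximum number of names a registry may assign under the QLP

def may_assign(name, recipient_is_rights_holder, recipient_is_public_authority,
               sunrise_list, assigned_count):
    """Return True if `name` may be assigned under the QLP rules sketched above."""
    if assigned_count >= QLP_LIMIT:
        return False                 # the 100-name cap applies across all categories
    if name not in sunrise_list:
        return True                  # not Sunrise-eligible: anyone may receive it
    if recipient_is_rights_holder:
        return True                  # on the list, but going to the rights holder
    if recipient_is_public_authority:
        return True                  # simplification: public-authority assignment
    return False

# Example: "brand" is on the Sunrise List, so only a rights holder may take it.
sunrise = {"brand", "famousmark"}
print(may_assign("city-hall", False, True, sunrise, 10))   # True
print(may_assign("brand", False, False, sunrise, 10))      # False
```

The public-authority branch is simplified here; under the QLP it applies to names that correspond to the authority or a place it operates.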

We hope that the QLP will be a useful tool for registries to use in launching their TLDs.



About a year ago, The Spamhaus Project was the victim of what was then considered the worst DDoS attack to date. It directed DNS response traffic at a rate of nearly 300 gigabits per second against Spamhaus’s name servers, flooding them so that they could not resolve requests for www.spamhaus.org, which made the site appear down to anyone unable to resolve the name.

To do this, the attackers exploited a vulnerability in the Domain Name System (DNS). They directed their attack against DNS resolvers and against the authoritative name servers that the registrant (in this case, Spamhaus) had set up for the regular operation of its own domain name. While the threat posed by this type of attack stems from attackers’ misuse of addressing resources, its mitigation rests in improving address management practices at the network edge (see Section 1.4 of SAC004 [PDF, 8 KB]).

Fundamentals of DNS DDoS Attacks

The attackers used three techniques – IP spoofing, reflection, and amplification – to launch this massive attack against Spamhaus. They coordinated to send extraordinary numbers of DNS queries to around 30,000 DNS servers. In each of these queries, the source address was spoofed, or set to the address of Spamhaus’s DNS server. This caused the 30,000 DNS servers to believe that their responses had to be sent to Spamhaus’s DNS server (reflection). And, to make things worse, the attackers crafted a DNS query that would cause all 30,000 DNS servers to reply with a very large response packet (amplification).
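The leverage these techniques give an attacker comes down to simple arithmetic. With purely illustrative packet sizes (actual sizes vary by query type and server configuration), a small spoofed query stream turns into a far larger response stream aimed at the victim:

```python
# Illustrative numbers only: actual sizes vary by query type and server config.
query_bytes = 64        # a small spoofed DNS query
response_bytes = 3200   # a large response (e.g. to an ANY query with DNSSEC data)

amplification = response_bytes / query_bytes
print(f"amplification factor: {amplification:.0f}x")       # 50x

# Bandwidth the attackers must send to flood the victim at 300 Gb/s:
target_gbps = 300
attacker_gbps = target_gbps / amplification
print(f"attacker needs only ~{attacker_gbps:.0f} Gb/s of spoofed queries")  # ~6
```

With reflection hiding the true source and amplification multiplying the traffic, a comparatively modest botnet can generate an attack of this scale.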

Is there a solution?

Back in the ’90s, Paul Ferguson and Dan Senie foresaw the enormous potential for harm that IP spoofing represents and published, via the Internet Engineering Task Force (IETF), a draft of the document that, after a long process and several years of discussion, ended up becoming BCP 38 (What is a BCP?).

BCP 38 describes the solution (or a defense, depending on who you ask) to IP address spoofing: that all Internet service providers and operators of Internet infrastructure implement a technique called Source Address Validation at the edges of their networks. If an ISP has implemented BCP 38 and receives an IP packet from a customer claiming to come from an address outside the range assigned to that customer, the ISP blocks the packet and does not let it through.
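In spirit, the check an edge router performs is very simple. Here is a minimal Python sketch using the standard `ipaddress` module, with a hypothetical customer prefix; real deployments do this in router filters, not application code:

```python
import ipaddress

# Prefixes the ISP has assigned to this customer port (hypothetical example).
assigned_prefixes = [ipaddress.ip_network("203.0.113.0/24")]

def source_address_valid(packet_src: str) -> bool:
    """BCP 38 ingress filtering: accept a packet only if its source
    address falls inside a prefix assigned to the customer it came from."""
    src = ipaddress.ip_address(packet_src)
    return any(src in net for net in assigned_prefixes)

print(source_address_valid("203.0.113.42"))  # True  (legitimate source)
print(source_address_valid("198.51.100.7"))  # False (spoofed: drop it)
```

Because the check runs at the edge, where the ISP knows exactly which prefixes belong to which customer, it is cheap; further into the core, that knowledge is lost and spoofed packets become indistinguishable from legitimate ones.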

Implementing BCP 38

BCP 38 greatly reduces the ability to create attacks on the scale of the one launched against Spamhaus. What’s unfortunate is that, although the first draft of the document was published in 1997, its measures have still not been widely implemented by ISPs throughout the world.

Because of the low level of BCP 38 implementation, I wanted to ask a few questions of Paul Ferguson, one of its co-authors, who very kindly shared his views for this post:

Carlos Alvarez (CA): You are one of the two authors of BCP38/RFC2827. How frustrating has it been for you that it’s not been widely implemented, even though the document was first published so long ago?

Paul Ferguson (PF): We (my co-author, Dan Senie, and I) actually published the document as an IETF draft in 1997. It later became RFC2267 in 1998 (what is an RFC?), was obsoleted by RFC2827 in 2000, and became a BCP (Best Current Practice), specifically BCP38, later in 2000.

How frustrated am I? Only slightly. The Internet is a *very* big thing, and it is not necessarily reasonable to expect complete compliance amongst all participants, especially on an architectural issue which is voluntary, and that is what BCP38 and its anti-spoofing recommendations are: voluntary.

Having said that, there is *no* legitimate reason to allow source-spoofed IP traffic on the Internet. Period. Full stop. Networks which allow source-spoofed IP traffic are basically allowing criminal activity within their own networks. We should encourage the entire Internet to embrace S.A.V.E. or Source Address Validation Everywhere.

CA: How do you think we could have all the ISPs and other Internet infrastructure operators implement source address validation? Should we, as some suggest, look for new regulation (Dave Piscitello‘s excellent article on regulation and DDoS attacks can be read here) making it mandatory everywhere?

PF: I would much rather prefer that the ISP community police itself and voluntarily implement these measures instead of governments and legislatures getting involved. We need to do this before someone else tries to force us to do it, and we end up with something draconian or technically impossible.

Having said that, getting ISPs to do this voluntarily does not seem to be working, so perhaps some small amount of legislation or regulation might be needed. I run hot and cold on this issue.

CA: Merike Kaeo (who is Merike?) recently shared that during an event she asked the Finnish why they are so good at BCP38. Their answer was gracious in its transparency: ‘We implemented BCP38 because our regulator said it is a good idea.’ So, if regulation is not the perfect answer, and it of course will never be the solve-it-all solution, should we replace all the employees of every operator of Internet infrastructure with Finnish folks?

PF: With regards to the Internet, I’ve always been of the opinion that “Let a thousand flowers bloom” is the appropriate attitude. We welcome diversity, and we should embrace it. But at the same time we should also embrace the idea of a CDC (Center for Disease Control) model for the Internet Infrastructure. If we all don’t agree to some basic guidelines to support the health of the Internet, someone may come along and simply burn it down as they like, and that’s what these DDoS attacks represent.

Some recommendations regarding Source Address Validation

Paul Vixie (who is Vixie?), in a recent conversation with a reporter, made some suggestions worth discussing on how to seek wider implementation of BCP 38. These include civil penalties for contributory negligence when a network without Source Address Validation is used as a DDoS launch point that causes harm, and ISO 9000- or 27000-style terms of reference, so that buying insurance becomes harder for networks that do not perform Source Address Validation.

Complementing these is SAC065 [PDF, 423 KB], published by ICANN’s Security and Stability Advisory Committee, which contains recommendations that should be adopted by all ISPs and Internet infrastructure operators around the globe.

I asked Vixie if he could write a few lines to the ISPs and Internet infrastructure operators in Latin America. This excerpt is applicable to folks all around the world: “Source Address Validation is a little more work for network operators, but the downstream benefit for the overall economy and for human society is well worth that extra work.”


Hopefully the ISP industry and all other operators of Internet infrastructure will soon implement Source Address Validation voluntarily. If they don’t, the frequency and size of DDoS attacks will continue to grow, increasing the likelihood of regional Internet blackouts and the likelihood that national governments will enact regulations that are not operationally or technically desirable.

Special thanks to Vixie and Fergie for their kind contributions to this article.

Greetings from California,

Carlos S. Alvarez
SSR Technical Engagement Sr. Manager
Security Stability Resiliency Team


Note: a Spanish version of this article was originally published here.


Training Wheels Off

by Fadi Chehadé on April 11, 2014

My sons are adults now, but I remember like it was yesterday teaching them how to ride their bikes. Removing the training wheels from their bicycles was an important milestone, but it didn’t mean that I was ready to leave them on their own to ride around the neighborhood. As they got used to being on two wheels instead of four, I was right there beside them, ready to correct and guide.

I can’t help but see the parallels when I think about the U.S. government’s announcement last month. The U.S. government came to the conclusion that the global Internet community is now ready to assume stewardship of ICANN’s performance as the administrator of the IANA functions. It feels like the moment when the multistakeholder community’s training wheels come off.

Our hard work for the last 15 years has led us to this milestone, when the U.S. government acknowledged that we as a community have performed our work with distinction and in alignment with our mission.

It is with great humility that we accept this trust and begin the work of developing and strengthening the accountability mechanisms that will be needed to give the world confidence.

During ICANN’s 49th Public Meeting in Singapore in March 2014, we launched a public dialogue on what the process for transitioning from the U.S. government’s stewardship should look like. On 8 April, we posted the initial results of that dialogue, along with a scoping document [PDF, 456 KB] written based on feedback from the community, and consistent with previous discussions.

Specifically, I wish to assure you that the U.S. government, including the NTIA, has approved the scoping document [PDF, 456 KB], and it is consistent with the views of the leaders of various Internet organizations including the Internet Engineering Task Force, the Internet Society and the Regional Internet Registries.

In addition to the public dialogue on the process for the transition of the U.S. government’s stewardship role, we launched a second public dialogue in Singapore at the same time, on the broader question of how to strengthen ICANN’s accountability. This second dialogue will look at strengthening existing accountability mechanisms, like the Affirmation of Commitments and ICANN’s redress mechanisms, as well as exploring new accountability mechanisms where necessary. We will share documents on the scope and proposed process of this second dialogue shortly.

These two dialogues will run in parallel, as they are certainly inter-related and will inform each other. A key difference is that the first process will be a public dialogue held in various venues across the global Internet community with a goal of developing the transition proposal requested by the U.S. government. The second is related to ICANN structures and the ICANN community, and the dialogue will mainly occur in the ICANN community while open to all.

These two public dialogues are real-time demonstrations of just how open and inclusive the ICANN community is, and will further prove to the world that we have earned the U.S. government’s confidence in our processes. This is our time to show that the multistakeholder model no longer needs its training wheels. It is ready for a long, steady ride forward.


Five months ago, ICANN launched the new WHOIS website, a one-stop shop for questions and information about everything WHOIS, the Internet’s record system for domain registration data. This new portal was just the first step toward ICANN’s goal of implementing substantial improvements to the current system, based on the Action Plan [PDF, 119 KB], a series of recommendations from the WHOIS Policy Review Team on how to move forward with improving WHOIS.

I am excited to announce that the new ICANN WHOIS Lookup tool, a key deliverable under the Action Plan, is now available on the WHOIS website in beta. It features a centralized search tool where users can find WHOIS data about any top-level or second-level domain registered in any gTLD under contract with ICANN, even those newly delegated into the root of the DNS.

This new system, which was created as a direct result of the community’s input, is designed above all to be a user-friendly, educational tool. Among its features are automatic translation of the data types into the user’s preferred language and a user-friendly, guided experience for performing WHOIS lookups to research domains. The information displayed through the search tool is not stored by ICANN; it is retrieved in response to a user’s query by accessing the WHOIS services of registrars and registries over Port 43.
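Under the hood, a Port 43 WHOIS lookup is just a one-line TCP exchange, defined in RFC 3912. The following minimal sketch shows the mechanics; the server name you query is an assumption you must supply (each registry and registrar runs its own):

```python
import socket

def build_query(domain: str) -> bytes:
    # RFC 3912: a WHOIS query is simply the name followed by CRLF.
    return domain.encode("ascii") + b"\r\n"

def whois_query(domain: str, server: str, port: int = 43, timeout: int = 10) -> str:
    """Send a WHOIS query over TCP port 43 and return the raw response text."""
    with socket.create_connection((server, port), timeout=timeout) as sock:
        sock.sendall(build_query(domain))
        chunks = []
        while True:            # the server closes the connection when done
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

# Example usage (requires network access and an appropriate WHOIS server):
# print(whois_query("icann.org", "whois.pir.org"))
```

ICANN’s Lookup tool wraps exactly this kind of exchange in a web interface, so users never need to know which Port 43 server to ask.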

ICANN is committed to continuing to improve the WHOIS Lookup tool and the WHOIS website in general. Your feedback is a critical part of the new lookup tool’s success, so please try it out and send us your feedback at http://whois.icann.org/en/submit-feedback. This is the first time ICANN has provided a comprehensive lookup tool to query WHOIS results, so any and all comments are encouraged.

The new WHOIS website and Lookup tool is just part of how ICANN is implementing the recommendations of the WHOIS Review Team. To find out more about the WHOIS Lookup tool or where other improvements stand, download the Draft Implementation Plan or the latest Implementation Status Chart.


Our main role at ICANN is to coordinate and manage those unique identifiers that make the Domain Name System work. It’s probably fair to say most Internet users in Africa, like elsewhere, don’t really care about the IP addresses assigned to their machines or the domain names under which their website or email systems operate. There’s even less understanding of trademark protection issues and dispute resolution systems.

It is against this backdrop that ICANN has organized an all-Africa Workshop on Domain Names, Trademarks and Users Rights Protection, to be held in Cotonou, Republic of Benin, on May 5-6. The workshop is part of the strategy designed by the Africa Strategy Working Group in 2012, which has become a new tool for ICANN’s engagement with the continent. Read about it here.

Indeed, in Africa, the companies that develop websites generally choose users’ domain names for them. Most users have little knowledge of WHOIS, a simple tool that provides basic information about domain names. This can affect a user’s activity online: a domain name owner, for example, might not be aware of basic renewal information.

Moreover, very few people on the continent care about registration of trademarks. The New gTLD Program offers registrants the opportunity to choose among more than a thousand domain name extensions. Those who already have domain names could consider registering them under new domains (ccTLDs or gTLDs). But they should also understand Rights Protection Mechanisms (RPMs), among them the Sunrise period and the Trademark Clearinghouse.

With many countries now organizing their own intellectual property rights structures, it would benefit them to understand the linkages, or lack thereof, among domain names, brands and trademarks, as well as intellectual property rights protection mechanisms.

Therefore we thought it important to provide a venue to explore trademark protection and dispute resolution mechanisms at both the country code (ccTLD) and generic levels so that best practices could be shared with those developing such mechanisms.

That’s what the Cotonou Workshop will do – address the trademark and rights protection issues that ccTLD registry managers, registrars and registrants in Africa are facing by providing a platform for experienced intellectual property practitioners to share their ideas. We believe it will empower African ccTLD managers and registrars to more quickly implement effective measures for ccTLD security, management and promotion.

So let me invite you to join the Cotonou Workshop on May 5-6, 2014. More information is available at http://cot14.africanncommunity.org. See you there!

Yaovi Atohoun is ICANN’s Stakeholder Engagement and Operations Manager for Africa


The Heartbleed Bug: Are you at risk?

by Dave Piscitello on April 9, 2014

Researchers have uncovered a vulnerability in OpenSSL, software that provides secure (encrypted) communications for electronic commerce, banking, and secure remote access (SSL VPN). The vulnerability has been termed the Heartbleed Bug. An attacker who successfully exploits it can read data from the memory of an attacked server. If the attacker is able to obtain the server’s private encryption keys from server memory, the vulnerability allows the attacker to decrypt and eavesdrop on secure transactions or communications.

OpenSSL is extremely popular and is used by an estimated half a million websites to encrypt their data.

ICANN is aware of the Heartbleed Bug. While the vulnerability does not affect the DNS, ICANN’s Security Team urges top-level domain registries, registrars (and their resellers) that provide e-merchant services for domain registration, and other online service operators that use OpenSSL, to upgrade to OpenSSL 1.0.1g, a version of OpenSSL that mitigates the threat from the Heartbleed Bug.
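A rough first check (not a full audit) is to see which OpenSSL build your own tooling links against. For example, Python’s standard `ssl` module reports the version it was compiled with:

```python
import ssl

# Report the OpenSSL version this Python interpreter was built against.
# Heartbleed affects OpenSSL 1.0.1 through 1.0.1f; 1.0.1g is patched.
print(ssl.OPENSSL_VERSION)       # a version string such as "OpenSSL 1.0.1g ..."
print(ssl.OPENSSL_VERSION_INFO)  # a 5-tuple: (major, minor, fix, patch, status)
```

Note that web servers, VPN gateways, and other services may link a different OpenSSL build than your local interpreter, so each one needs to be checked and upgraded separately.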

Organizations that use SSL-based Virtual Private Networks for secure application access should also take measures to mitigate this threat.

If you are looking for additional information on the Heartbleed Bug, I would recommend the following three pieces:


IANA in Transition: Update

by Theresa Swinehart on April 7, 2014

We received an overwhelming response to our call for feedback on the design of a community-driven process toward achieving the transition of NTIA’s Stewardship of the IANA Functions. From the launch of the public comment process in Singapore two weeks ago through last Thursday, we received numerous ideas on principles and mechanisms, and we are now incorporating them into our consultation materials. We are also taking into account feedback received from our I* partners.

As a result, we need a little longer than the anticipated 7 April deadline to finalize and post the scoping document, draft process proposal and timeline for further public dialogue. We will now post them by COB on 8 April (UTC). Thanks for your understanding and stay tuned!


Recognizing Our Community Leaders

by David Olive on April 2, 2014

by David Olive, Vice President of Policy Development Support


ICANN Celebrates Volunteer Leaders: (Left to right) Dr. Steve Crocker, José Arce, Sylvia Herlein Leite, Holly Raiche, Avri Doria, Rafik Dammak, Chris Chaplow, Marie-Laure Lemineur, Roelof Meijer, and Fadi Chehadé.

ICANN’s growth and evolution as a multistakeholder organization depends on the sustained engagement of our community. Indeed, our greatest asset is our community members’ time and commitment to the work of ICANN.

During ICANN 49 in Singapore, the ICANN community recognized ten leaders for their service. These leaders included David Archbold, Geographic Regions Review Working Group chair, and Rafik Dammak, Non-Commercial Users constituency voting delegate on the Nominating Committee.

From the At-Large community, we thanked Avri Doria, New gTLD Working Group chair; Holly Raiche, APRALO chair; José Arce, LACRALO chair; and Sylvia Herlein Leite, LACRALO secretariat. Roelof Meijer was honored for his service as ccNSO councilor, and Chris Chaplow was acknowledged for his service as Commercial Business Users constituency vice chair in the GNSO.

Our community also paid tribute to the lives and careers of two past leaders. Marie-Laure Lemineur, chair of the Not-for-Profit Operational Concerns constituency, remembered Alain Berranger for his service as the inaugural chair of NPOC. Bruce Tonkin, vice chair of the Board of Directors, gave remarks about Jon Bing and his tenure on the GNSO Council. Both Alain and Jon leave lasting legacies at ICANN and continue to inspire their colleagues and our community.

Dr. Steve Crocker, Chair of the Board of Directors, concluded the program by formally thanking our community participants on behalf of the Board of Directors. The Board of Directors also approved a resolution at its public meeting on 27 March recognizing the leadership and contributions of our community members.

The Policy Development Support team is committed to recognizing our community members for their efforts. They are critical to the success of policy development work at ICANN. We are deeply grateful for their dedication and for their leadership in our bottom-up community.


by Christopher Mondini and Riccardo Ruffolo, ICANN Global Stakeholder Engagement


“…the key questions of human rights and accessibility of the Internet for all, and inclusivity of all interests [have] become very important…”

These were some of the words Fadi Chehadé shared in an opening video he sent to the RightsCon – Silicon Valley conference, which took place in San Francisco March 3-5, 2014, to express ICANN’s support. ICANN joined the Internet Society (ISOC) as one of the many sponsors of the event.

RightsCon is convened by Access Now and brings together human rights activists and Silicon Valley business in support of the dual mission of the organizers: to defend and extend the digital rights of users at risk around the world and to fight for open and secure communications for all.

This year more than 700 attendees from 65 countries and 375 institutions attended. Some of the world’s leading human rights experts, investors, corporate leaders, engineers and activists came together to tackle human rights challenges in tech. Government leaders from Estonia, Sweden, the U.S. and other countries were also on hand, making RightsCon a good multistakeholder venue to engage on Internet governance issues with a healthy dose of input from civil society organizations.

RightsCon 2014 was also a great place to be inspired by how organizations are using access to the open, global Internet for admirable aims: to document war zone atrocities, monitor environmental degradation and report on human rights abuses. On the business side, it was extremely informative to see how global tech companies are examining their own responsibility and capabilities to address thorny issues they navigate in relations with users, governments and shareholders.


In the ICANN session, “Internet Governance 101: What’s at Stake in 2014,” the room was packed, and we were fortunate to lead a conversation with an array of distinguished panelists:

  • Bertrand De La Chapelle – Director of Internet & Jurisdiction Project
  • Anja Kovacs – Director of the Internet Democracy Project
  • Nnenna Nwakanma – Regional Coordinator, World Wide Web Foundation
  • Chris Riley – Senior Policy Engineer at Mozilla
  • Carlos Affonso Souza – Director of the Instituto de Tecnologia e Sociedade (ITS) at the Getulio Vargas Foundation

The panel offered diverse perspectives on Internet governance, covering the views of different stakeholders from Europe, India, Brazil and West Africa. The same diversity of views was showcased during a separate session on the NETMundial meeting in Brazil “Sao Paulo and Beyond: the Future of Internet Governance.”

Both panels shared a couple of conclusions: Internet governance is complicated, and it is not always easy for the newly informed to get involved. A tweet from the CEO of Spiegel Online captured the sentiment nicely:

“In case you we’re wondering what internet governance processes look like. A scray flow chart #rightscon” – Katharina Borchert, CEO of Spiegel Online (pic.twitter.com/33pzfO21tD)

Panelists committed to continue working to make organizations like ICANN and forums like the IGF and NETmundial better known, and to create more platforms, like learn.icann.org, to make participation by newcomers easier. If attendance at the RightsCon sessions is any indication, a cohort of knowledgeable and passionate stakeholders is eager to get involved.