Thanks to everyone who responded to the post Seeking Your Feedback on the Co-Design Process. We are accepting responses on this form through Friday, November 29th. Here is what people are saying so far:
Respondents
The nine respondents thus far have reported deep involvement in the co-design process. The mean number of activities reported per respondent was 3.8, and the median was 3.5. The most common co-design activity among respondents was commenting in this forum. No further demographic information was requested, though about half of respondents chose to share their names.
What I Wish Had Been Different
Critiques of the process included requests for more cohesion among the different co-design activities, as well as requests for both more and less input from different types of commercial entities. There was also feedback on voting and learning, and a couple of challenges to the premise of the process.
- Differing Views on Commercial Participation: One respondent wished “[t]hat representatives from companies producing (non open source) AI systems did not participate in the system review.” Another respondent wondered if there should actually have been more representation “from a commercial POV.” This second respondent noted that “[c]ompanies that are interested in incorporating AI systems into their apps and products (such as leveraging an LLM) are going to ultimately face the test in the markets (and courts).” The respondent guessed that individuals from these companies might not have participated because they “tend to ignore such design processes early on because they’re busy running their businesses.”
- Less Use of Voting: One respondent wished “the vote step” on which components should be required for a system to be called open source “did not happen.” This respondent felt that “it was not particularly meaningful and contributed in the end to reduce the credibility of the process.”
- More Learning Moments: Another respondent acknowledged that “[t]he process involved a lot of informal pedagogy about terms, research and technical fields, which is to be expected.” However, they “wish[ed] that the learning moments had been held in advance, so that the discussion about the definition could take place with people who had the same information background.”
- Challenging the Premise: One respondent was dissatisfied with the premise of the co-design process, which was that open source AI could and should be defined. They wrote, “Open Source and AI are oxymorons.” Instead, they proposed that “[i]t should be Open Core AI and a list of training data + seed number provided.” Another respondent gave an even stronger criticism of the co-design process, saying they wished “Everything” about the process had been different and appreciated “Nothing.”
- More Cohesion: Another respondent wished there had been more cohesion between different parts of the co-design process. She wrote, “I fel[t] like I was working on ‘parts’ rather than understanding the entire process.” As a solution to this problem, she wished that “the co-designers had shared the blueprint together,” which would have helped co-designers be “more proactive in the process.” Another respondent felt similarly. “There were various working groups within the OSAID co-design process,” he noted, “but I only recently learned the details of their existence.” He wished that both the working groups and participants’ affiliations had been shared more actively. He also wished there had been more cross-pollination between co-design venues, specifically that more working group participants had shared their views in this forum.
What I Appreciated
In terms of what went well, respondents appreciated the co-design methodology, as well as the pace of the process and the care taken by both OSI staff and co-design volunteers.
- Participation and Transparency: One respondent appreciated that “it was co-design in the first place, open to many stakeholders.” Another respondent appreciated “[t]he effort” to use co-design. Another appreciated that the process provided participation platforms, specifically “the approach of using both HACKMD and the forum in tandem.” Openness was an important part of what respondents valued. One appreciated that the process “happened in public.” Another hoped that “the remaining work” to revise and maintain the definition would also have “a clear, transparent, precise… process.”
- Reliable and Predictable Pace: The tempo of the co-design process was also appreciated. One respondent felt it “proceeded at a clear, steady pace.” Another appreciated the “speed between proposals and implementation… without unnecessary procrastination.” Still another respondent appreciated that “[t]he pace of the progress was good and predictable.” Another noted that “releasing drafts incrementally… worked well.”
- Patience and Respect: Given the long duration and challenging nature of the work, one respondent said they appreciated “[t]he patience of OSI staff as well as the time and effort of the volunteers to dig into something they really care about.” Another respondent appreciated that “[p]eople were nice and respectful.”
- Well-Coordinated: Another respondent appreciated “the coordination, despite many controversies… which resulted in the release of version 1.0.” This respondent concluded by saying that she was “grateful for being a part of this important project.”
Thanks to those who have given feedback; it helps us improve our work. If you would still like to contribute your thoughts, the form will remain open for responses through Friday, November 29th.