
This policy roundtable, held on 7 January 2026, brought together 20 expert stakeholders to discuss AI, copyright, and the future of the novel. Participants included writers, publishers, parliamentarians, policymakers, peers, lawyers, trade unionists, and industry experts.
This roundtable builds on the work of the Minderoo Centre for Technology and Democracy (MCTD) at the University of Cambridge and the Institute for the Future of Work (IFOW). This includes the AHRC-funded BRAID UK research by Dr Clementine Collett (BRAID Research Fellow, MCTD, University of Cambridge), which resulted in the report ‘The Impact of Generative AI on the Novel’, published alongside a shorter policy brief, and IFOW’s research and report ‘Creative Industries and GenAI’.
The research report on the impact of AI on the novel highlights the urgent need for policymakers and industry leaders to understand how best to protect writers and the novel, which form part of a major UK success story, but one being put under threat by the widespread adoption of these new technologies.
The purpose of this roundtable, chaired by Professor Gina Neff, was to have an informative and constructive discussion around the future of creativity, writing, publishing, and the novel in the age of GenAI, and to think practically about what can be done to address the issues we face.
Major themes discussed during the roundtable included:
1. From Principles to Solutions
There was consensus that the last year has enabled alignment on core principles around copyright, including the need for transparency, remuneration, control for creatives, and licensing. Participants called for the focus now to shift to turning these principles into feasible and effective solutions. The technical working groups run by the Government will be crucial to this.
In particular, there was a call to concentrate on technical solutions for a licensing marketplace. One participant spoke of the need for exemplar licenses, so that creatives can understand what a license might look like and feed into these conversations. Some speakers also noted that difficulty would arise if the copyright regime were changed partway through the exploration of these solutions.
2. Political Context
The roundtable discussed how practical and technical solutions around copyright and licensing must be considered within the current and shifting political and geopolitical context. Given this context, one participant asked what we can do without the Government’s blessing, what we can do technically on factors such as readability, and what we can do politically to have these technical solutions adopted.
3. The Importance and Nature of Transparency
Trust and transparency were key themes of the discussion.
Participants pointed out that, given the history of authors’ work being used without compensation or permission, authors’ trust will be hard to gain. However, participants spoke of how publishers are committed to developing and piloting practical solutions.
It was noted that licensing solutions will be one way to build trust with creatives. However, participants questioned how this can be achieved without clear transparency from technology companies.
Transparency was spoken about not as a moral gesture but as a fundamental part of a functioning market. There was a call for the development of clear licensing solutions and the delineation of what ‘good transparency’ means. Some participants spoke about how an AI Bill could be a crucial vehicle for addressing and legislating for certain types of transparency on training data, web crawling, and so on.
The need for transparency from companies with market insight – for example, eBooksellers – was also mentioned as important, so that the extent and nature of AI-generated content could be assessed on an ongoing basis.
4. The Income of Creatives
An important point was made about the impact on creatives’ income. Many creatives rely on supplementary streams of income to support their creative work, and it is this supplementary work that they are now losing, making creative careers harder to sustain. One participant cited recent research on comics makers: 12% say they have lost work and a further 24% suspect they have, and the losses usually fall on the freelance work that supports their creative practice. If this persists, we are likely to be left with only those who can already support themselves to make art, causing major socio-economic disparity in the voices represented within our country’s creative work.
5. Opportunity to Address Impacts on Children
There was widespread fear about the implications of AI and social media for children, from the early years through to the teenage years. There was discussion around engaging with children, and with technology companies, on this. One participant spoke about how, while technology companies are often the problem, they might also be able to help with solutions. There was consensus that we must grasp the opportunity to bring reading back to the heart of children’s lives.
The de-skilling potential of AI was also discussed, and participants were adamant that children must not lose the opportunity to develop the skills formed through reading and writing.
6. The Productivity Narrative and Friction
During the roundtable, there was discussion of how the narrative of productivity is central to the Government’s backing of AI. Some participants argued that we should challenge the Government on this: there should be less focus on productivity and more on embracing the ‘friction’ of creative processes. It was noted that many young people feel a creative career is not for them, an attitude that needs to be addressed because the creative journey is important for their development. The same applies to adults, who are often subject to the narrative that they must be productive and so have less time to read, whether by themselves or with their children.
One participant noted that the question of the productivity narrative cannot be divorced from the labour market. More people now study STEM subjects and fewer study literature; we have come to see the purpose of university as securing a job, and creativity as belonging to leisure, but this is a false narrative.
7. AI Use within Publishing
Participants discussed how publishers can address authors’ antipathy towards AI: by discussing positive uses of AI in publishing, building on the openness some creatives already show, and ensuring responsible AI adoption. There was also a call for transparency around how publishers themselves are using AI.
8. Future Research Areas
Further research was called for in two areas:
a) Technical feasibility of transparency: one participant noted that while AI companies claim they cannot specify which data has been used to train their models, court cases have shown that these companies can produce this information when pressed. There was a call for more research to support the technical argument that transparency is possible and feasible, and that a record of training data should always be kept.
b) Public value of AI-generated content: there was a call for more research on whether the public sees AI-generated or part-generated content as valuable, and why or why not. This could also be explored in relation to the labelling of AI-generated material and how labelling helps people decide what content to consume.
9. Expanding the Conversation
Multiple participants called for more groups to be brought to the table for open discussion, including:
a) Technology companies championing responsible AI: one participant noted that technology companies creating responsible AI systems are not currently involved in the working groups, but it is crucial to bring them to the table and ask what they need to build their models.
b) Big tech: there was some positivity from participants regarding engagement with big tech companies going forward, as markets will require answers from them on issues around liability, accuracy, and cost. Some participants spoke about the importance, and the challenge, of bringing these companies to the table with other stakeholders, because system design carries an agenda that shapes our social landscape, and we need to work towards that agenda being the right one.
c) Young people: it was noted that, in conversations around the use of AI, perceptions of AI-generated content, and education around AI and reading, we must include young people and children to hear first-hand about their uses and perceptions of AI.
d) Diverse voices: there was a call to ensure that policy conversations are not conducted within a silo, but that they include a diverse range of people from different backgrounds.