Make Janis amends to CC Gen AI
emilyjmacaulay authored Oct 31, 2024
1 parent 9aaff6f commit 790d8c7
Showing 1 changed file with 3 additions and 2 deletions.
_events/2024-10-17-connected-conversation-gen-ai.md (5 changes: 3 additions & 2 deletions)
@@ -34,9 +34,10 @@ In this Connected Conversation we brought together people advocating for worker
Our three speakers shared their provocations around generative AI and workers’ rights.

### [Janis Wong](https://janiswong.org/about/), Data & Technology Law Policy Advisor at [The Law Society](https://www.lawsociety.org.uk/)
-There is a history of technology being deployed in the legal system ([Law Tech](https://www.lawsociety.org.uk/topics/ai-and-lawtech)) and AI is now starting to be implemented in ways such as case management and research. This is raising questions for the profession about client confidentiality, legal privilege and responsibilities towards regulators and clients. AI impacts across the profession differently as there are many different organisations in the space - from law firms that are [innovating and developing](https://www.legalgeek.co/conference/) AI systems, to SMEs and house counsels. Some are merely seeking to get to grips with understanding ‘off the shelf’ generative AI and The Law Society is seeking to ensure that a two tier system is avoided (the have and have nots). There is also a ‘service’ nature of the legal profession, where solicitors (for example) are in service to their clients. This raises a question about [client needs and wishes in relation to AI being used](https://www.lawsociety.org.uk/campaigns/21st-century-justice).

-From a workers’ rights perspective there are many different roles within the legal profession (e.g. barristers, solicitors, paralegals, court staff) and all are impacted by “billable hours”. Any AI deployment that impacts on those, will have wide ranging repercussions. Those employed in the legal profession are not traditionally considered to be “data workers” but increasing as technology developments are implemented, they are.
+There is a history of technology being deployed in the legal system ([Law Tech](https://www.lawsociety.org.uk/topics/ai-and-lawtech)) and AI is now starting to be implemented in areas such as case management and research. This is raising questions for the profession about client confidentiality, legal privilege and responsibilities towards regulators and clients. AI’s impact varies across the profession, as there are many areas of practice and different organisations in the space - from law firms that are [innovating and developing](https://www.legalgeek.co/conference/) AI systems, to SMEs and in-house counsel. Some are merely seeking to get to grips with ‘off the shelf’ generative AI, and The Law Society is seeking to ensure that a two-tier system (the haves and have-nots) is avoided. There is also a ‘service’ nature to the legal profession, where solicitors (for example) are in service to their clients. This raises a question about [client needs and wishes in relation to AI being used](https://www.lawsociety.org.uk/campaigns/21st-century-justice).

+From a workers’ rights perspective, there are many different roles within the legal profession (e.g. barristers, solicitors, paralegals, court staff) and all are directly or indirectly impacted by the “billable hours” business model. Those employed in the legal profession are not traditionally considered to be “data workers”, but increasingly, as technology developments are implemented, they are.

The UK’s Master of the Rolls, our second most senior judge, is a strong advocate for technology and would offer the provocation that, in a future legal system, the profession may be liable if it does *not* use technology that is proven to be more accurate and efficient.

