One of the biggest specialist file sharing platforms on the planet announced significant policy changes earlier this month. Then backtracked. And the story won’t end here.
It’s a story of our times, in the most timely way. In early July, WeTransfer, a major file sharing platform and the preferred service for many who need to send large folders around the world instantly, revealed an update to its terms and conditions. This stated that anyone using the system was automatically permitting their content to be used to train artificial intelligence.
An uproar ensued. The company then issued a statement saying it would never use audio, video, design decks, illustrations, document drafts, or any other file type sent via its service to feed generative AI’s insatiable appetite. The reputational damage has been significant, and many in industries from film to music to marketing have been busy signposting peers to Swiss Transfer, a rival and previously pretty obscure file sharing platform which has not threatened to use materials to boost machine learning.
Understandably, the situation has made many feel uncomfortable. Those at WeTransfer have managed to darken the skies above a previously well-respected name in the eyes of its core user base. Meanwhile, the professionals who rely on this system to share important work with peers, clients, and colleagues realised how vulnerable they are to sudden policy changes. And even if you’ve never used the platform in question, there’s a lesson in this.
The backlash against WeTransfer has been sufficient to warrant a complete U-turn, albeit one veiled as a denial that there was ever any intention to start using original work to train AI. But this is one reasonably sized fish in a medium-sized pond where competition is fierce and there are many options users can turn to if they need to share large files. Granted, not all of them are free, but several are, including Microsoft’s OneDrive and Google’s own Drive.
Just last week, the UK Government came to a long-anticipated agreement with Google Cloud which, on paper, is logical. The idea is to use the tech giant’s benchmark-setting services to phase out legacy systems and, as Technology Secretary Peter Kyle puts it, give public sector organisations “the market clout” to secure the best possible value for money.
At the same time, Downing Street is ploughing ahead with its push to introduce artificial intelligence into the civil service en masse. We’ve seen a number of ‘in house’ (comparatively speaking) programmes developed, and the idea makes sense on paper. But an agreement with OpenAI — the firm behind ChatGPT — minted in the last few days remains shrouded in mystery.
Critics include Chi Onwurah, Chair of the Commons Science & Technology Select Committee, who has demanded more transparency over the finer details. Specifically, there’s a need to ensure that all public data will remain protected under national legislation, and will not fall into the kind of free-for-all we’re seeing when it comes to AI and industries dealing in copyrighted materials. Music, digital art, design, video, and writing are all being taken, without express permission, and used to improve programmes that can make music, produce digital art, design imagery, create video and write words without a human involved.
So there are plenty of reasons to feel sceptical about the deal. Not least because the Post Office Horizon scandal is still a very raw and recent example of just how badly a large-scale public sector technology rollout can go. But we’re also seeing a furious race within artificial intelligence. And parliamentary debates over the impact this sector will have on the UK’s much-lauded creative industries show how unwilling governments are to stand in the way of progress at any price. At what other point in history have sectors been asked to hand over the rights to their work so that the machines which may replace human professionals can be trained?
Of course, pressure is mounting on all tech companies to keep pace, and that means a desperate land grab for any and all materials that could prove useful in developing advanced AI models. In many ways, the companies themselves are just as much victims of the lack of rules and laws stopping them from consuming everything in a bid to build anything faster and at a lower cost.
Historically, the public sector’s greatest technological gripe was being sold overpriced, often outdated solutions which frequently failed to connect with systems in other areas of government. Just a few years ago, the idea of running civil service IT in the same way as private sector IT was, for many at management level, a pipe dream requiring budgets beyond their wildest hopes.
So it’s ironic that the will to overhaul public sector technology procurement, direction, functionality and networking is finally moving towards where it should have been years ago, at the exact moment when the change means engaging with a completely unregulated sector. And an industry which has already shown itself to be uncompromising and unwilling to negotiate its future development on anything like fair terms. A landscape in which even remotely vague contracts should raise red flags, because changes to terms and conditions can plunge users into panic and render them effectively powerless to protect their most valuable data.
Image: Alexey Demidov / Unsplash