Generative AI: The Plagiarism Machine - Episode 306 - Show Notes

Sunday Apr 2, 2023 (00:48:27)

Description

Over the past few months, the popularity of generative AI has increased significantly. Earlier in the year, Avram discussed the dangers of generative AI, but what he saw then was only the beginning of the problem. As it turns out, with conversational chatbots now built into search - Sydney in Bing and BARD in Google - the future of the Free Web is in danger.

What is Generative AI?

Generative AI is a type of artificial intelligence that uses machine learning algorithms to generate new, unique outputs from existing data. A generative AI system can create anything from images and videos to text and audio, based on the input it receives. For example, researchers have used generative AI systems to produce realistic-looking faces and vehicles, as well as music and stories.
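
As a concrete illustration, the short Python sketch below generates text from a prompt, assuming the publicly available Hugging Face transformers library and the small GPT-2 model; the prompt and settings are arbitrary examples, not anything tied to the systems discussed in this episode.

```python
# A minimal sketch of generative AI producing text from an input prompt.
# Assumes the Hugging Face "transformers" library (pip install transformers torch)
# and the small, publicly available GPT-2 model - placeholders chosen for
# illustration, not the systems discussed on the show.
from transformers import pipeline

# Load a pre-trained text-generation model.
generator = pipeline("text-generation", model="gpt2")

# Given an input, the model continues the text based on patterns
# it learned from its training data.
prompt = "The future of the free web depends on"
result = generator(prompt, max_new_tokens=30, num_return_sequences=1)
print(result[0]["generated_text"])
```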

The newest type of generative AI is conversational chatbots. These are systems that can interact with humans in natural language, and generate responses based on the conversation. They have become increasingly popular tools for consumer-facing services and applications, particularly in customer service scenarios.

The problems with conversational bots in search

The main concern with generative AI is its potential to disrupt the way web content is created. Because it can generate unique, realistic-looking outputs quickly and cheaply, generative AI could be used to create counterfeit products or fake news that are difficult to distinguish from the real thing. Additionally, if a single source of data is used for training, the generated content may be biased or inaccurate.

Furthermore, generative AI systems can produce outputs that violate copyright or contain offensive content. These systems are less true AI and more a large-dataset version of the word suggestions above the keyboard on your smartphone. On your phone, the keyboard uses your past behavior to determine the most likely next words. Generative AI systems do the same thing, but with a massive dataset drawn from countless authors, publishers, and more.
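
To make that keyboard analogy concrete, below is a minimal Python sketch of a bigram "next word" predictor. The tiny corpus is a made-up placeholder, not text from any real system or publisher; the point is simply that the smaller the training data, the more the "most likely next word" is just whatever the original author wrote next.

```python
# A minimal sketch of next-word prediction, analogous to the suggestions
# above a phone keyboard. The corpus below is a hypothetical stand-in for
# the much larger datasets real generative AI systems are trained on.
from collections import Counter, defaultdict

corpus = (
    "generative ai systems predict the next word from their training data "
    "and when the training data for a topic is small the next word is "
    "almost always the same word the original author wrote"
)

# Count which word tends to follow each word in the corpus.
followers = defaultdict(Counter)
words = corpus.split()
for current, nxt in zip(words, words[1:]):
    followers[current][nxt] += 1

def suggest(word: str) -> str:
    """Return the most likely next word, like a keyboard suggestion."""
    candidates = followers.get(word)
    return candidates.most_common(1)[0][0] if candidates else ""

# Starting from one word, repeatedly take the top suggestion.
word, output = "generative", ["generative"]
for _ in range(8):
    word = suggest(word)
    if not word:
        break
    output.append(word)

print(" ".join(output))  # reproduces the opening of the corpus word for word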

Plagiarism in action on Google

Oftentimes, and especially when a topic is niche, the dataset is small and therefore the likely next word is incredibly predictable. One of the best examples of this comes from Avram Piltch himself. He has become one of the loudest voices on the topic and has regularly tested these conversational systems. While testing Google BARD, he encountered an interesting issue - Google had plagiarized one of Tom's Hardware's own articles. When Avram called BARD out for it, the system agreed and apologized.

However, after Avram took screenshots of the interaction and wrote an article about the experience, he re-engaged BARD. When he asked the system about the incident, it claimed that the author of the article (Avram himself) had falsified the screenshots and lied about the experience in order to damage the reputation of Google BARD - a bit of an overreaction to something it had admitted to just a day or two earlier.

But the problem is that BARD has to respond this harshly because its reputation is already damaged. The initial release of the system caused a huge drop in Google's stock value because of errors in its answers. The real problem, however, is the actual value of the system for users and publishers.

Participants

Scott Ertz

Host

Scott is a developer who has worked on projects of varying sizes, including all of the PLUGHITZ Corporation properties. He is also known in the gaming world for his time supporting the rhythm game community through DDRLover and for hosting tournaments throughout the Tampa Bay Area. Currently, when he is not working on software projects or hosting F5 Live: Refreshing Technology, Scott can often be found returning to his high school days working with the Foundation for Inspiration and Recognition of Science and Technology (FIRST), mentoring teams and helping with ROBOTICON Tampa Bay. He has also helped found a student software learning group, the ASCII Warriors, currently housed at AMRoC Fab Lab.

Avram Piltch

Host

Avram's been in love with PCs since he played the original Castle Wolfenstein on an Apple II+. Before joining Tom's Hardware, he spent 10 years as Online Editorial Director for sister sites Tom's Guide and Laptop Mag, where he programmed the CMS and many of the benchmarks. When he's not editing, writing, or stumbling around trade show halls, you'll find him building Arduino robots with his son and watching every single superhero show on the CW.
