Language-generative AI May Hinder Scientific Research

In April 2022, Nature magazine reported that the University of Michigan had released a report on the social impact of emerging language-generative artificial intelligence (AI) technologies. Shobita Parthasarathy, one of the scientists behind the report, warns that software designed to summarize, translate, and write like humans could exacerbate distrust in science.

Emerging language-generative AI technologies, built on large language models, use machine-learning algorithms trained on large amounts of text to produce fluent language, enabling them to summarize, write, translate, answer questions, and even generate code much as humans do. The big tech companies that build such models plan to use the technology in chatbots and search engines.

When language-generative AI enters scientific work, it helps researchers find information quickly. However, the underlying machine-learning models can be flawed: they may contain outdated information and gloss over disagreements and uncertainties. Moreover, these models are typically not public, and the texts they learn from may be drawn from limited datasets. At the same time, interaction with these AI tools is highly personalized, with each user receiving individually generated information. Users who are unaware of these limitations and rely on the technology's output may arrive at biased or oversimplified scientific arguments that contradict complex realities.

In particular, the report argues that transparency is critical to the use of language-generative AI in science. Developers should clearly state what texts were used in the machine-learning process and how their algorithms work. Large scientific publishers should participate in the development of these AI tools. And scientists should be wary of journals or funders that rely on the technology to find peer reviewers, evaluate manuscripts, or conduct funding reviews.

Aussie Brief News


Translator: MOS English Team - Xiequyuan
Design & Editor: Hbamboo
