Almost all of us have experienced self-reinforcing problems. We get a bad night's sleep, our energy is depleted, and we rely on coffee to get through our to-do list. But that last cup we drank at 5pm comes back to haunt us, leaving us staring at the ceiling for hours at bedtime.
This cycle repeats until we finally cut out the afternoon caffeine. As examples like this demonstrate, the term “vicious circle” exists for a reason: breaking the loop of cause and effect is difficult.
AI slop is another example. Scientific American defines it as “mass-produced, low-quality content.” Fast Company calls it the “collective term for digital trash.” Although definitions vary, experts agree that AI slop is widespread. Unfortunately, stopping it isn't as simple as reducing your coffee intake.
Greater effort will be needed to combat the influx of low-quality content on the internet, and workplaces stand to bear the brunt.
Harper's Magazine reports that more than half of the internet is now generated by bots, creating “AI feedback loops that are collectively corrupting the very real and 'very true' human data that was supposed to make this technology so powerful.”
This means that when an AI model is trained on garbage (in this case, slop), the quality of the output we receive will continually decrease. Harvard Business Review reports that many companies are already feeling the impact. Instead of increasing efficiency, this low-quality content is hindering productivity. Research shows that it creates additional “cleanup work” for employees (averaging about two hours per incident) and can fuel dissatisfaction among co-workers. This is a pervasive problem, with the MIT Media Lab reporting that for 95% of organizations, AI provides no measurable return on investment.
For example, a manager may ask an employee to write a report evaluating the success of a business area. The time-poor employee uploads the data to an AI tool to create a first draft. The output is thin but highly readable, so the employee sends it along without editing. The manager, facing a dozen reports to review, asks an AI platform to summarize the findings before uploading the still-bloated document to a shared drive. Did the use of AI tools help the company identify pain points and improve its operations? No; it may actually have wasted time and internal resources while adding little value.
What are the lessons to be learned? Rather than implementing AI tools across the board, companies need to be smart and strategic about when and how they enable their employees to use them. As Forbes points out, the “antidote” to slop is not simply more AI; it is using the technology to “extend what’s already good” in the workplace. Companies should conduct an internal assessment of their needs and goals to determine where AI tools can add real value to their processes.
For example, can AI help perform repetitive or mundane tasks, freeing employees to focus on more complex or nuanced responsibilities where they have expertise? Once a company decides to deploy AI tools, it should establish and implement comprehensive governance policies, ideally with the help of experienced cybersecurity professionals, and ensure that all employees receive appropriate training on how to use vetted tools effectively and safely.
Recent headlines report that AI slop is “destroying productivity” and “disrupting America’s workplaces.” But AI isn't inherently bad. The technology can increase efficiency and expand capabilities, including in the workplace. If we want to break the vicious circle of slop, we have a responsibility to use AI responsibly.
Editor's note: Chris Wright is a co-founder and partner at Sullivan Wright Technologies, an Arkansas-based company that provides cybersecurity, information technology, and security compliance services. The opinions expressed are those of the author.
