Monday, September 2, 2024

A Scale AI Subsidiary Targeted Small Businesses for Data to Train an AI.

January is typically the slowest month for Dawn Alderson, who owns a hair salon in a Philadelphia suburb. So when she came across an online ad last December offering $500 to $700 to help train AI algorithms, her financial anxiety eased. She signed up to work for the company, Remotasks, in anticipation of the dry spell.

Remotasks is a subsidiary of San Francisco-based Scale AI, a unicorn startup with a valuation of $14 billion. Alderson joined a booming field of contractors who either supplement their incomes or carve out full-time work developing generative artificial intelligence.

In industry parlance, the work performed by Alderson and millions of contractors across the globe is called tasking. It consists of hourly work logging the mistakes made by AI tools such as chatbots, image generators, and voice-to-text technology so these tools can be deployed for commercial use. A tasker might, for example, teach a self-driving program how to identify common features of a street by tagging pictures of stop signs and traffic lights.

Alderson's tasking journey started slowly. She completed onboarding and waited to get her first assignment. After a couple of weeks, she was invited by Remotasks to participate in an initiative that hit close to home: "collecting and using datasets that are relevant" to small businesses.

"Your data could be key in shaping the future of AI in the world of business," said the query, obtained and reviewed by Inc. "For each dataset contributed that meets our criteria, we're offering a reward of up to $700 USD."

The fee felt like a winning lottery ticket. Alderson recalls thinking, "Oh, my god, I have waited for this moment. I have everything that they're looking for."

The project had a nonsensical code name: Bulba Ice. Business owners didn't know the identity of Remotasks' ultimate client, but the work seemed simple enough to justify the financial reward. The prompt called for routine insights into the daily life of a small business, with clearly defined questions answered by concrete numbers. Alderson included sales totals, the types of appointments made at her salon, and examples of her inventory.

She submitted five datasets, three of which the company accepted. According to the requirements, the datasets had to be exhaustive and encompass 50 rows and five columns on a spreadsheet. For the three datasets that were accepted, Alderson received $2,100.

Jubilant, she felt the work-from-home windfall would replace her typical winter precarity. She locked in, providing 37 more datasets over the course of an intensive weekend of work in January. But after waiting for months and following up repeatedly with a Remotasks support account over email, payment for those 37 datasets never came.

In March, the Bulba Ice project was curtailed without warning, and many contractors believe they were misled about a project that solicited information about their small businesses. Alderson found herself joining an outspoken group of contractors on a Remotasks Slack channel demanding answers for late payments.

Those who complained faced swift retribution. Most of them had their access to the Bulba Ice Slack channel revoked without warning, according to Josh Bicknell, the owner of a tutoring company who participated in the Bulba Ice project and whose account was corroborated by several other contractors. He says at least 50 accounts were deactivated in the purge.

Soon, a consensus emerged among the taskers: they were getting ripped off.
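For readers picturing what a Bulba Ice submission looked like, the requirements described above (a spreadsheet of roughly 50 rows and five columns of concrete business figures) can be sketched as follows. The column names and values below are invented for illustration; they are assumptions, not drawn from any actual Remotasks template or participant submission.

```python
# Hypothetical sketch of a Bulba Ice-style submission: a small spreadsheet of
# anonymized, concrete business figures. Column names and values are invented
# for illustration only; the article does not disclose the real template.
import csv

rows = [
    # date, appointment_type, services_sold, revenue_usd, inventory_units_used
    ("2024-01-05", "color + cut", 6, 780.00, 4),
    ("2024-01-06", "cut only", 9, 540.00, 2),
    ("2024-01-07", "bridal styling", 3, 450.00, 5),
    # ...a full submission would span roughly 50 such rows
]

with open("salon_dataset.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["date", "appointment_type", "services_sold",
                     "revenue_usd", "inventory_units_used"])
    writer.writerows(rows)
```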
They had provided private business information to an unknown AI project with the promise of compensation. There was a nagging suspicion among the group that the data could be used to build an AI that could ostensibly harm their businesses: Like building a tool that's then marketed back to them as an essential product. Or having the data sold to competitors whose new technological prowess could leave them in the dust.

Alderson discovered she had been deactivated from the Bulba Ice Slack channel on March 20. Three weeks later, she received an email saying her 37 datasets had been rejected for repeating certain questions in separate submissions--something a Remotasks manager had told her was allowable, she says. Alderson believes she's owed $25,000.

AI tasking presents small-business owners with unknown risk

The Bulba Ice project was different from traditional data labeling, which typically involves training AIs to accurately identify what's shown in an image. (Does this picture show a car or a motorcycle?) But it shared one striking similarity with the broader AI tasking economy: Contractors are almost always in the dark about what they're developing and whom they're developing it for.

"If you're labeling some aerial image that's produced by a drone, are you training a toy airplane, or are you training a military drone? There's no real way of knowing," says Mark Graham, a researcher at Oxford University's Fairwork Foundation.

The situation is emblematic of a larger problem in the creation of generative AI tools. For society to realize the benefits of this new technology, independent contractors, from impoverished people in the developing world to struggling artists, authors, and even scholars with advanced degrees, provide an essential human touch to build a technology that could potentially render them obsolete. Small-business owners, many of whom have to scramble from time to time, have now gotten a taste of that dilemma firsthand.

"Instead of accountability, or telling the stakeholders and the contributors anything about it, Remotasks completely cut down communication," says Lain Myers-Brown, who works at a comic book store and submitted data on behalf of its owner.

You could argue that entrepreneurs should have known better than to get wrapped up with Remotasks and Scale AI. The company's international reputation is littered with accusations of non-payment. Scale has been accused of failing to pay workers in the Philippines and Africa, regions where it has traditionally recruited most of its taskers.

Following reports in The Washington Post about Scale's business endeavors overseas last year, the company issued a statement: "Remotasks is our global platform designed for flexible, gig-based data annotation work. It was established as a separate platform to protect customer confidentiality." Scale's customers include OpenAI and the Department of Defense.

As interest in AI tools has expanded, so too have allegations of unpaid labor and deception among U.S.-based taskers, who complain of unpaid training sessions and pay rates that fluctuate without explanation. In June, Inc. revealed that contractors working for Outlier AI, another Scale subsidiary that hires AI taskers, reported various instances of non-payment, despite Outlier's having recruited aggressively for hundreds of open positions. Outlier's taskers routinely ask questions about the company's policies and legitimacy on a Reddit channel that boasts 11,000 members.
Last November, a class action complaint was filed against Invisible Technologies, another San Francisco-based data annotator, for violating various California labor codes, including failure to pay overtime, failure to pay timely wages, and failure to grant paid sick leave, among others. (Invisible Technologies did not respond to an Inc. request for comment.)

Regarding Remotasks' small-business project, a Scale AI spokesperson said the company was overwhelmed by the responses. Bulba Ice "received greater interest than anticipated, and due to an influx of submissions our review process took longer than projected. We have communicated with each individual participant regarding the status of their submission, and payments for eligible datasets have been paid in full."

Several participants in the Bulba Ice project dispute those claims. "Nothing has been communicated individually," says Myers-Brown. "I have no faith that any person involved in the project who was employed by Scale has any idea how to communicate across departments, how to track tickets or emails, or maintain data governance rules."

A whistleblower report filed with the U.S. Government Accountability Office in March alleged missed payments on a mass scale. There was no discernible difference between datasets that were accepted and datasets that were rejected, said the tasker, who runs a North American logistics company and declined to be identified.

While many involved in the Bulba Ice initiative were paid for data they provided, the threat of legal action was often necessary, according to emails reviewed by Inc. In May, the tasker William Webster, who submitted data from a marketing firm on behalf of the company's owner, wrote an email to a Remotasks support account complaining of late payments. Only after he said he'd call a lawyer did a payment of $5,600 land in his bank account, four months late, Webster says.

And while a company email to contractors in May said it would be deleting all rejected datasets within 30 days, small-business owners who spoke to Inc. claim there's no way to prove it.

Some involved in the project say that Remotasks did initially pay on time, but the punctual payments didn't last. "They seem to have paid a lot of people in the beginning to lure them in and have them send more data--and then they stopped paying regularly," Alderson says.

The GAO referred the whistleblower complaint to the Office of Inspector General and the Department of Labor, both of which declined to investigate. The Equal Employment Opportunity Commission declined to comment.

"I think that most of it has been paid," says the whistleblower. "But without the pressure we gave, I doubt that they would have been paid."

The hard, human work of training AI

The work of tagging and annotating the information that large language models and image generators spit out is part of a process called reinforcement learning from human feedback, or RLHF. Without RLHF, programs like ChatGPT might offer only nonsense in response to prompts, or lack the near-human element that made generative AI an overnight sensation.

Taskers usually work from home in a gig economy setup similar to that of Uber drivers--they aren't forced to work and they can, theoretically, set their own hours. That's if things are working properly.

As the business world's demand for AI-enabled products swells, Scale AI is recruiting scores of gig workers to its subsidiaries with the promise of flexible work. Wages vary per task and can range greatly, but usually fall between $15 and $40 per hour.
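To make the RLHF step described above more concrete: a tasker is typically shown a prompt along with candidate model responses and records which response is better and why. The record format below is a hypothetical sketch; the field names and values are invented for illustration and are not Scale's or Remotasks' actual schema.

```python
# Hypothetical sketch of the kind of record an RLHF tasker might produce:
# a prompt, two candidate model responses, and the human's preference.
# All field names and values are invented for illustration.
import json

annotation = {
    "prompt": "Explain why the sky is blue in one sentence.",
    "response_a": "Because sunlight scatters off air molecules, and shorter "
                  "blue wavelengths scatter the most.",
    "response_b": "The sky reflects the color of the ocean.",
    "preferred": "response_a",          # the tasker's ranking
    "issues_flagged": ["factual_error_in_b"],
    "time_spent_seconds": 95,
}

print(json.dumps(annotation, indent=2))
```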
But according to eight contractors interviewed by Inc. who have worked for Remotasks, Outlier, or both, overcrowding and minimal direction often lead to a culture of organizational chaos.

The exact number of taskers working for Remotasks and Outlier is unclear, but it runs into the hundreds of thousands. One Remotasks Slack general channel has 461,000 members, while another for Outlier has 107,000, according to screenshots seen by Inc.

"It's just a mess," said a tasker who freelances for Outlier and asked not to be named for fear of repercussions. "You have no manager, you have nobody to help you. ... It's a massively disorganized shit show."

Remotasks' attempts at management were fleeting. In March, the Bulba Ice Slack channel had one team leader available to answer questions. With more than a thousand taskers, the channel was cluttered and, in time, the manager grew overwhelmed.

"I was the only tasker success manager working on that project, and we're talking about thousands and thousands of datasets," says the former manager, who asked for anonymity, fearing professional consequences.

When Bulba Ice was abruptly halted after less than two months, the project's Slack channel erupted. When they signed up, taskers were told that datasets would be paid for only if they were accepted by Remotasks' client. But they expected to be informed of the client's decision and whether or not they'd be compensated. People wanted to know what was going to happen with their data and if they were going to get paid.

The Remotasks success manager was instructed not to respond to questions. "I probably spent two weeks watching the contributors post messages in the Slack channel that I was advised not to respond to," the former manager says. "That was very hard, because that's not what I was brought on to do."

What happens to all of the tasking data?

Unlike most projects in the tasking economy, Bulba Ice asked for anonymized data gathered by real small businesses. Though the project called for no personally identifying information, it took on an invasive dimension for Myers-Brown. "That's your data. You own that data that you crafted from real events," she says. (Scale did not respond to an Inc. request for comment about how it deletes rejected datasets.)

The lack of transparency strikes some of the taskers as particularly galling, given the size of Remotasks' parent company. Scale announced $1 billion in new venture funding in May. The company's founder and CEO, 27-year-old Alexandr Wang, is the world's youngest billionaire and something of a tech wunderkind. He co-founded Scale in 2016 at age 19, and is chummy with Elon Musk, who publicly applauded a controversial screed Wang published earlier this year about meritocratic hiring.

Last August, Scale announced a contract with the Department of Defense, allowing the military access to its internal data training platform. "The capabilities will be made available across a diverse range of networks and classification levels relevant to warfighters," the announcement said.

Wang would not comment for this story. He told Fortune in May: "I think the entire industry expects that AI is only going to grow, the models are only going to get bigger, the algorithms are only going to get more complex and, therefore, the requirements on data will continue growing--we want to make sure that we're well-capitalized."

In many ways, tasking is the AI revolution's invisible bedrock.
Official numbers on how many taskers there are and where they're located are sparse. A study by Google researchers in 2022 estimated that there are millions globally. Organizations such as Oxford University's Fairwork Foundation are studying the landscape and attempting to create governance guidelines with respect to pay and workers' rights.

It's indisputable, however, that the industry is vast and making inroads in the U.S. and the Western world as a result of the generative AI gold rush, which saw $29 billion invested by venture capitalists across the globe last year.

The challenge of managing a remote tasking workforce

Appen, a data annotation firm based in Sydney, Australia, has more than one million taskers on its platform alone, according to Samantha Chan, the company's VP of engagement and success. She says it's tough to keep that many contractors--who are dispersed globally--aware of pay policies, deadlines, and all the minutiae associated with their projects. "Some things may fall through the cracks--which is inevitable, working with that many people," Chan explains.

Appen has plenty of poor reviews on sites like Glassdoor and Reddit. But the company allows taskers to communicate on its internal platform and makes project managers available to answer questions, Chan says. The company lists its leadership team on its website.

Making such information public is a minimum requirement of transparency, says Mark Graham of Oxford University. That small measure of accountability is sorely lacking at most companies in the field, Graham claims. Taskers in the Bulba Ice project say they were usually forced to fend for themselves on sprawling Slack channels with little to no oversight.

In its research, Fairwork issues annual reports ranking companies in the digital gig economy on criteria related to pay, working conditions, and transparency. Data annotation firms are among them: In Fairwork's 2023 ranking, which assessed firms on a scale from 1 to 10, with 10 being the best, Appen received a 3. Remotasks, which makes no mention of its parent company on its website, received a 1.

Graham hopes the research conveys a simple message: "There's nothing impossible about creating decent work conditions."
