Von Glitschka was just trying to teach people how to create designs in Adobe Illustrator.
Glitschka, half of the two-person design firm Glitschka Studios in Salem, Oregon, had shared the link to his Zoom video call on his Facebook, LinkedIn and Twitter profiles on Saturday, and a dozen or more people tuned in.
Without warning, one user took over the meeting, displaying a YouTube channel playing "some kind of neo-Nazi thing," said Glitschka, who had just started paying for Zoom. Then the person — a man with a French accent, Glitschka said — started annotating the screen with a racial slur.
"I was just embarrassed that they logged in just to see some creative stuff, and then they had to be exposed to that kind of idiocy," he said. He had been using the service for about 2½ years before his call was hit with the so-called zoombomb — a term that's been in the lexicon for only two weeks.
Now Glitschka is no longer posting links to his Zoom calls publicly. Instead, he's asking anyone interested in joining to send him a private message and then providing access information.
Zoom is having a moment as people turn to video chat while they're forced to stay at home to stem the coronavirus outbreak. It's well on its way to becoming a cultural icon and shorthand for video chat, much like we say "Google" for search or "Uber" for ride-sharing.
Zoom's mobile app now has over 32 million daily active users, 10 times as many as a year ago, Bernstein Research analysts Zane Chrane and Michelle Isaacs wrote in a note distributed on Wednesday, citing data from privately held Apptopia.
But as the growing phenomenon of zoombombing shows, sudden popularity can tease out previously unforeseen problems. In the case of Zoom, if a conference organizer shares a link in public and doesn't take steps to limit access, anybody who sees the link can join the call — and do whatever they want on it.
On Monday, the FBI's Boston division issued a warning on zoombombings. The agency had received reports of meetings getting interrupted with pornography, threats or hateful content and discouraged people from sharing links to meetings on social media.
Beyond zoombombs, security is another emerging issue for Zoom as it keeps growing. On Monday, the office of New York's attorney general, Letitia James, asked Zoom in a letter to describe any changes it has made after a software developer found that Zoom's Mac app could turn on a person's camera without permission, The New York Times reported. Another person found that Zoom's Windows app could lead users to unintentionally share a "hashed" version of their Windows account credentials, which an attacker could use to run programs on the computer without the owner's permission, according to BleepingComputer. Zoom is working to address both issues, marketing chief Janine Pelosi said.
Increasingly, zoombombing attacks are being carried out by groups that seem to be working in coordination. This can lead to a particularly disorienting flood of images, and make it almost impossible to de-escalate without ending the meeting.
On Monday, Laurel Walzak, an assistant professor at Ryerson University's media school, organized an informal Zoom meeting to talk about sports. She had directed people to a website for the weekly talks where they could subscribe and gain access, and had also shared the link to the meeting.
About 30 people joined the call. Soon after it started, five to seven users began calling up vulgar images and entering text comments in a chat window. Unsuspecting call participants gasped as unwanted music played and several people spoke.
"If I muted, whatever controls they had — they could unmute," Walzak said. "If I tried to shut the video down or delete them, another one of them would pop up."
She wanted to end or leave the meeting. One person on the call suggested she hit a keyboard shortcut to access the task manager on her PC, from which she could close the Zoom program.
Walzak hit the keyboard shortcut, but her computer screen went green before showing filthy images, she said, causing her to wonder whether a virus had infiltrated her machine. She restarted her computer and sent participants an email containing access information for a fresh Zoom meeting. That second meeting went off without a hitch.
John Saddington, founder and CEO of San Francisco-based business-software start-up Yen.io, has been paying to use Zoom for years. He had never been zoombombed before Monday.
Saddington, who has over 73,000 followers on YouTube, had upgraded his Zoom account to run webinars, and this was his first time sharing the stream on YouTube. He had circulated a link to join the Zoom call on Twitter and elsewhere. "I didn't expect much of it, to be honest, and didn't expect anyone to click it," Saddington said.
As more than 200 people watched the stream from YouTube, a cluster of around 20 people joined within just a few seconds. One appeared in a ski mask, with blacklight showing in the background. Several people started shouting racial slurs as one user played videos of sex acts.
Saddington was overwhelmed.
"Do I close down Zoom or do I close down the YouTube live stream?" he said. "I kind of had a blue screen of death as all of this stuff was happening, and it was very loud and in my face." Instead, he totally switched off his computer. Text messages and tweets flowed in asking what had happened, and he apologized and said he wasn't ready to begin streaming again. He needed time to decompress, he said, and people understood.
Saddington said he had heard about zoombombing, and how to stop it.
"I knew exactly what it was. And yet, I didn't think that it would actually happen to me, and I'm a technologist," he said. "I really care about privacy and security, but that caught me off guard."
Some institutions have begun to offer advice on how to lower the risk of getting bombed, but hosts don't always heed it.
For instance, Stony Brook University warned faculty members this week in an email message and provided guidelines for prevention, including using the university's Zoom system and only permitting authenticated university users to join meetings.
Caitlyn Cardetti, a Stony Brook Ph.D. student focusing on cellular and molecular pharmacology and president of the school's Graduate Women in Science and Engineering group, received that email — it's just that she wanted to make sure people from a similar group at a nearby institute, Cold Spring Harbor Laboratory, could join an informal call she was hosting. She started a video call with her school account and shared the link on Twitter and other online venues.
The call had a dozen participants, including one woman with her 2-year-old son on her lap. About a half-hour in, five people with male names piled in. They shouted obscenities, and one person shared a sexually explicit image in front of everyone.
Cardetti brought the meeting to a close and started a new one that disabled screen sharing, kicking out two people who didn't seem to belong to the group. The participants spent five minutes essentially rolling their eyes.
"As a women's group, we were like, 'Well, this is frustrating, and this is why groups like us are here,'" she said. She said the meeting crashers might have been boys in high school who found the meeting link on Twitter.
There are ways to lower the chance of getting zoombombed. The company has been encouraging people to use features like a waiting room, a meeting lock and a limit on screen sharing.
On Wednesday, Cardetti received an email from Zoom saying that the company had enabled waiting rooms for her meetings by default, a few days ahead of schedule.
"We have decided to take this prudent action immediately to secure our education community," the company wrote in the email.
And on Friday, Zoom changed a default setting for K-12 schools that had accepted its offer of free accounts without restrictions such as the 40-minute meeting limit. For those schools, hosts are now the only ones who can share screens by default, Zoom's Pelosi said. The company is evaluating whether to change that default setting for other types of users, she said.
Zoom also welcomes reports of zoombombing on its website.
"Those [zoombombs] are obviously incredibly unfortunate situations, and to see anybody leveraging this difficult time for so many in such a bad way definitely hurts us," Pelosi said. "That is not why we built this technology, for these things to be occurring."
Zoom has been working with other companies to stop groups of like-minded people from coming together to disrupt meetings. One group was found on YouTube and shut down, she said. The company has also asked YouTube to take down certain Zoom-related videos, a spokesperson told CNBC in an email.
Pelosi said that when it comes to sharing the links, the responsibility falls on users.
"If someone wants to put their link on Twitter, then things can happen," she said. "People can see that you're putting a link on a public forum."