A time bomb 'supercharged' by the pandemic: How white
nationalists are using gaming to recruit for terror
Experts are warning that far-right agitators are using online
gaming platforms to spread hate and recruit a new generation of
converts. Supercharged by the rise of gaming and social
isolation during the pandemic, extremism academics say more
needs to be done to police these platforms for grooming and hate. Io
Dodds reports
Thursday 07 April 2022 18:48
The player's profile picture raised no red flags: just the
smiling, Lego-like face of a typical Roblox avatar, little
different from the estimated 220 million people who log in at
least once a month to the wildly popular children's video gaming
platform.
On closer inspection, however, the player's "favourites" list
had been arranged into an impromptu mosaic with the words:
"Patriot Front. Life, liberty, victory! Reclaim America!" The
Patriot Front is an American group of fascist street fighters,
who use "reclaim America" as their slogan.
The player was also part of an in-game group called Justice 4
Floyd, whose logo appeared to be based on the black shield of
Nazi German SS combat divisions in the Second World War. That
group was linked by "alliances" to other Roblox groups with
names such as the British Nationalist Vanguard, the Condor
Division (similar to the Nazis' Condor Legion), and the New
Hampshire 2nd Infantry Platoon, whose description bore
references to known neo-Nazi groups.
This is just one of the suspicious networks uncovered in popular
video games and gaming-related social networks by Alex Newhouse,
a researcher at the Middlebury Institute of International
Studies in Monterey, California. In a talk last week at the Game
Developers Conference in San Francisco, he and gaming
psychologist Rachel Kowert laid out evidence of how the gaming
boom of the pandemic era has given far-right extremists, who
have long been active in gaming communities, new opportunities
to recruit and organise.
Worse, Mr Newhouse argues that major gaming companies, in their
quest to attract and retain users, are expanding the features
that extremists can weaponise while failing to increase their
safety efforts at the same pace – creating an online time bomb
that could lead to offline violence.
"As games are becoming more and more like social media platforms
themselves, with all the features that you would expect from a
Facebook or a Twitter, like groups and channels and friend lists
and all that, they become attractive to extremists who have been
deplatformed by Facebook and Twitter," he tells The Independent.
"We know that extremists are intentionally structuring these
networks to mobilise people to violence. They say as much, and
we've interviewed former extremists who talk about this... there
are individuals who are actively on the lookout or people they
think can be spun out into a mass shooter or a terrorist."
Extremists flock to Steam, Discord and Roblox
Gaming is no stranger to the far right. The hobby's large
quotient of disaffected and socially isolated young men has long
proved attractive to extremist groups, who have a historical
pattern of exploiting unexpected online services to spread their
message.
Mr Newhouse was a reporter at GameSpot in 2014 when resentment
over the increasing prominence of feminism and minority advocacy
in gaming communities exploded into a reactionary movement known
as Gamergate. The controversy spawned intense harassment
campaigns against female and non-white game developers and
galvanised the careers of activists and influencers who later
became key figures in the "alt right".
Gamergate also gave rise to 8chan, an online message board,
which has since become a central organising space for terrorists
across the world. This was where the Christchurch mosque gunman
in 2019 posted his manifesto, peppered with references to gaming
culture, and where "Q" – the mysterious messiah of the QAnon
movement – posted almost all of their conspiratorial
prophecies.
"In a cruel twist of fate, games critics and games media are
still today, in some ways, much better equipped to handle the
current landscape of extremism and disinformation than the
people who were covering terrorists in the 2000s, in early
2010s," says Newhouse, who later worked on data privacy at Sony
PlayStation and is now deputy director of Middlebury's Center on
Terrorism, Extremism and Counterterrorism.
Last year, Mr Newhouse noticed an "increasing amount of chatter"
in extremist networks that suggested their members were moving
in large numbers to services such as Discord, a group chat app
popular with gamers, and Steam, an all-purpose online gaming
platform that combines store front, social network and games
library.
The timing coincided with a series of crackdowns by major social
networks in the wake of violent attacks linked to extremist
communities such as QAnon and Boogaloo, culminating in the
storming of the US Capitol in January 2021. Mr Newhouse suspects
that was one motive for the migration, though he cannot be sure.
Discord, he told the GDC audience, "has become probably the main
place for the initiation of someone from the early stages of
radicalisation into increasingly robust [far-right]
socialisation and identity". He says extremist communities often
use a Discord "server" – effectively a linked group of
chatrooms, which can be public or private – as a hub for
activity in various video games.
Why Discord? Partly because it's already popular, especially
with the white men and boys aged 15 to 22 whom far-right groups
tend to target. But Mr Newhouse also says: "Discord has shown
that it's relatively unwilling to take pretty significant action
against these groups.
"They're able to post under the radar well enough that they
don't get as much attention as the big Telegram or Facebook
networks... it's probably the most popular, least enforced
platform."
'Groomed via online gaming'
In 2017, Discord's staff learned that their app had been one of
the major organising places for the Unite the Right white
supremacist rally in Charlottesville, Virginia, at which an
avowed neo-Nazi killed one person and injured 35 in a vehicle
ramming attack.
At the time, Discord's "trust and safety" team consisted of
one person. The event sparked change inside the company, and by
last May the safety team had swelled to about 60.
Yet the problem persists. An investigation in August by the
Institute for Strategic Dialogue (ISD), a British think tank,
found 24 extreme right servers on Discord; 100 such channels on
the livestreaming service DLive; 91 channels on Twitch, a
better-known livestreaming service owned by Amazon; and 45
public groups on Steam.
Steam appeared to host the most entrenched and long-lasting
networks, acting as a community hub, while Discord servers had
short life-cycles due to the company's crackdowns and were used
to provide "safe spaces" for young people to explore extremist
ideas and to coordinate harassment campaigns against minority
groups.
Meanwhile, journalists have found numerous examples of far-right
groups building propaganda content in video games, such as
interactive Nazi concentration camps in Minecraft and Roblox.
The ISD's report also found evidence that several extreme right
Discord servers hosted many under-17s, "raising concerns that
the platform is being used for the radicalisation of minors". It
found limited evidence of extremists using gaming communities to
groom new members, concluding that most used them to bond with
fellow radicals and mobilise action.
Exit Hate, a British group that helps people leave extremist
movements, tells The Independent that it is currently mentoring
two people who were "groomed via online gaming".
A spokesperson said those people could not be interviewed
because Exit Hate recommends its mentees wait 12 months before
speaking to the press, but added: "Just over 70 per cent of
people we talk to have been recruited online, with a growing
number influenced by far-right gaming."
Although there isn't enough data to know how the pandemic has
affected this picture, mainstream social networks have suffered
sharp upticks in extremism. Moonshot, a British company that
works with tech firms and government agencies to design and
measure the performance of counter-extremism programmes, says it
saw a surge in extremist activity across the board.
"We have consistently found a connection between social
isolation and extremism, so increased isolation during the
pandemic, we believe, also made people more vulnerable," says
Ross Frenett, Moonshot's co-founder and co-chief-executive, who
has helped draft reports on extremism in gaming communities for
the European Commission.
"The pandemic [also] allowed a number of different, previously
loosely connected or unconnected ideologies to fuse. The fusion
of some of the anti-vax conspiracy narratives with QAnon and
neo-Nazi ideas – all that was supercharged during the
pandemic."
'When I heard about this, my jaw hit the floor'
For Dr Kowert, who is research director at the gaming-focused
mental health charity Take This, such research was an alarming
wake-up call.
In article <YlPuM.35057$JbKe.6219@fx04.iad>, creon@creon.earth says...
Foil hat time?
Warning! Always wear ANSI approved safety goggles when reading posts by Checkmate.
On Sat, 22 Jul 2023 11:48:08 GMT, Creon had the audacity to say the following:
TL;DR
When I got to the part about a "White Supremacist" group named "Justice 4 Floyd", I felt something yanking on my leg. Also, was Lo Dodds maybe supposed to be Lou Dobbs? I'm not a gamer, but I'd bet this is a honeytrap with more F-I-BEE agents than January 6 and the Gretchen Whitmer entrapment caper combined.
On 7/22/2023 15:48, Checkmate <moderator-wida@baseball.bat> wrote:
It was Dark Brandon's Gazpacho Police funded by George Soros that set up
the honeypots where loyal rock ribbed god fearing real men and real
women of faith were tricked into eating fake meat grown in a peachtree
dish that indoctrinated their brains and made them guilty of violent >insurrection when all they really wanted to do was visit our beautiful >Capitol and enjoy prayerful and patriotic camaraderie (intercourse)
between like-minded individuals of Abrahamic (but not jewish or muslim) >faiths.
On Sun, 23 Jul 2023 08:47:00 -0400, mixed nuts <melopsitticus@undulatus.budgie> wrote:
I made a fortune at the condom concession though.
Swill