Wednesday, February 21, 2018

The Humans and the Bots

Had a thought today about the world in which we currently live. It was poker-related, too -- in fact, online poker-related -- so I figured I might share it here.

Post-Black Friday my online poker playing essentially dwindled to some half-hearted noodling on a couple of the small, remaining sites, then disappeared entirely save the occasional play money game on PokerStars.

Not long ago I got an account on this new site called CoinPoker. It went live in November, and I believe it was sometime in December or maybe early January when I hopped on there for the first time. The site is “powered by blockchain technology via Ethereum,” and in fact the games are played with a newly invented cryptocurrency, “Chips” or “CHP” (now listed on a couple of exchanges).

The site had an ICO (Initial Coin Offering) -- actually a “pre-ICO” and then two stages of ICOs -- in which a good chunk of these CHPs were sold for Ethereum. Meanwhile the site has been conducting tournament freerolls to give away the rest of the CHPs. There were a lot of those early on, though the schedule has thinned a little lately.

It’s through the freerolls that I won some CHPs and began a modest “bankroll” on the site, something with which to play in the “cash” games. I haven’t explored where exactly things stand as far as depositing and withdrawing are concerned, and don’t really anticipate doing so soon (unless perhaps I were to run my small total up significantly).

Playing on the site has been diverting, though, and for the first time in several years I have found myself genuinely invested in the games when playing poker online. I’ve even revived some of those earlier online poker memories of pleasure and pain associated with wins and losses, to a lesser degree of course.

When I first started on the site, I’d join the freerolls, which, like all the games on the site, are played either four-handed or six-handed. Very frequently there would be players at the table shown as sitting out, something I grew accustomed to quickly. At a six-handed table there might be three or four seats occupied by the non-playing entrants, and occasionally at four-handed tables I might be the only live one there, just scooping up blinds and antes until the field got whittled down.

At the time I assumed the site was filling the empty seats with these “dummy” players just to make the freerolls last a little longer, or perhaps to foster the impression of more traffic than there really was. Whatever the reason, I haven’t noticed the sitter-outers as much lately, or at all, really. As the site has grown a bit more popular, I imagine that if such a strategy were ever employed, it has since been abandoned. (I’m only speculating.)

I wasn’t bothered too much by all the players sitting out, although the presence of all of those silent “zombies” at the table did cause me to recall the controversies and occasional hysteria surrounding the use of “bots” in online poker. Coupled with some of the news from the past few weeks (and months), that in turn has made me think about the significant influence such software applications running automated tasks or scripts online now have upon our lives.

It’s an enormous subject, but in particular I’m thinking about the indictment handed down last Friday by Special Counsel Robert Mueller that charges 13 Russian nationals and three Russian entities with conspiracy to defraud the United States via their attempts to meddle in the 2016 U.S. presidential election. If you’ve read through the 37-page document spelling out what happened (or heard it summarized), you’re familiar with some of the methods these agents employed to manipulate the news and opinion Americans consumed during the campaign, especially via social media.

The indictment describes in detail how a Russian company called the “Internet Research Agency” (a name sounding equally generic and sinister) employed hundreds of people to help generate content published via fake accounts with invented personas on YouTube, Facebook, Instagram, and Twitter, content that was in turn disseminated far and wide “via retweets, reposts, and similar means.”

The network has been characterized as a “bot farm,” and even this week there was evidence of the network (or something like it) continuing to operate via the rapid spread of various messages (including false ones) in the wake of the deadly school shooting in Parkland, Florida a week ago.

One of the more curious aspects of the “disinformation operation” (as some have described it) is the way invented news and opinion gets picked up and further distributed by unsuspecting social media users (i.e., Americans not involved with the operation). The indictment describes “unwitting members, volunteers, and supporters” of the campaign the Russians were backing as having performed such work, along with others “involved in local community outreach, as well as grassroots groups.”

In other words, certain messages and information “campaigns” begun by this Internet Research Agency were initially promulgated by a vast number of fake accounts with programs or “bots” helping extend their reach and influence. Then actual, living and breathing humans receiving those messages (and unaware of or unconcerned about their origin) passed them along as well, increasing their audience and influence.

Setting aside questions of legality and jurisdiction (and ignoring entirely the many other areas being explored by Mueller and his team), I just want to isolate that phenomenon of an automated message sent via a “bot” being received and then resent by a human. The fake accounts being directed by the scripts are simply executing commands. The humans who then receive and resend those messages do so consciously, although they, too, act by rote in a sense, simply hitting “like” and “retweet” in what is often an uncritical fashion. (Bot-like, you could say, depending on your predilection for irony.)

When playing against the “dummy” non-players in those freerolls, I could comfortably bet or raise against them every single time, knowing full well that even though they might resemble “human” players sitting there at the table, they weren’t going to play back at me. They were programmed simply to fold every time the action was on them. If you’ve ever played against “the computer” in crude games (including poker games), you’ve probably similarly been able to pick up on the program’s patterns and exploit them in your favor.
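(If you like seeing things spelled out, here is a tiny sketch of my own -- purely hypothetical, and certainly not CoinPoker’s actual software -- of why a fold-only “dummy” seat amounts to free money for the one live player at the table.)

# Purely hypothetical sketch (my own illustration, not CoinPoker's code):
# a "dummy" seat that always folds versus a live player who bets every hand.

def dummy_bot(to_call):
    """The sitter-outer: folds whenever there is any action to face."""
    return "fold"

def live_player(to_call):
    """Knowing the dummies never play back, just put chips in every time."""
    return "raise"

# Simulate a blind-vs.-blind battle: the live player open-raises, the dummy
# folds, and the live player scoops the blinds every single hand.
blinds = 15      # small blind plus big blind (made-up numbers for the example)
stack = 1000
for hand in range(100):
    if live_player(to_call=0) == "raise" and dummy_bot(to_call=10) == "fold":
        stack += blinds
print(stack)     # 1000 + 100 * 15 = 2500 -- blinds scooped, zero resistance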

Of course, increasingly sophisticated programs have been created to run much more challenging poker-playing “bots,” including those powered by artificial intelligence. Such programs can in fact exploit the tendencies of human players, who often find it very difficult to randomize their actions and thereby avoid detectable patterns. It’s much harder to know what to do against these, as some of the more recent efforts in this area have demonstrated.
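(Another made-up sketch, just to illustrate the point about randomizing: a program can mix its actions according to fixed frequencies -- the percentages below are invented for the example -- while a human’s habits tend to leak a readable pattern.)

# Hypothetical illustration: detectable habits versus genuinely mixed actions.
import random

def habitual_human(hand_strength):
    # A readable pattern: always raise the strong hands, always fold the weak.
    return "raise" if hand_strength > 0.6 else "fold"

def mixing_program(hand_strength):
    # A program can randomize with fixed frequencies (a "mixed strategy"),
    # sometimes raising weak hands as a bluff, so no single action gives the
    # hand away. The frequencies here are invented purely for illustration.
    if hand_strength > 0.6:
        return random.choices(["raise", "call"], weights=[0.8, 0.2])[0]
    return random.choices(["fold", "raise"], weights=[0.7, 0.3])[0]

# An observer tracking the habitual player soon learns every raise means a
# strong hand; against the mixing program, a raise might be strength or air.
for strength in (0.9, 0.3):
    print(strength, habitual_human(strength), mixing_program(strength))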

Many of those who forwarded along memes, photos, articles, and other bot-created content during the 2016 presidential campaign weren’t aware of the original source of that information (were “unwitting”). They were -- and are, still -- being exploited, in a way, by others who know how they tend to “play” when using social media.

The “game” is getting a lot harder. As far as social media is concerned -- and news and politics and everything else in our lives that has now become so greatly influenced by the message-delivering mechanisms of social media -- it’s becoming more and more difficult to know who is human and who is a bot pretending to be human.

Especially when the humans keep acting like the bots.

(EDIT [added 3/19/18]: Speaking of CoinPoker and bots, there’s an interesting new article on PartTime Poker sharing some research regarding the site’s unusual traffic patterns. The title gives you an idea of the article’s conclusion -- "CoinPoker’s Traffic is a Farce.")

Image: “Reply - Retweet- Favorite” (adapted), David Berkowitz. CC BY 2.0.
