Stories from Wednesday, September 16th, 2020
EA To Rebrand Its Origin Platform As It Bows Out Of The PC Gaming Platform Wars
from the white-flag dept
by Timothy Geigner - September 16th @ 7:55pm
It has been a long and largely fruitless road for Origin, EA's PC gaming client that the company had planned on building into a rival to Valve's Steam. What was originally supposed to be the chief antagonist to Steam in the ongoing PC gaming platform wars is instead best described as a failure to launch. Released in 2011, Origin began life as it lived in total: the walled garden for most EA games. Criticism appeared almost immediately, stemming from odious requirements to relinquish personal information, the use of DRM, and security flaws. Couple that with a game library that was, by design mind you, far more limited than Steam's, and it's not difficult to understand why adoption numbers for the game client just never took off.
Several weeks ago, to the surprise of many, EA suddenly released its gaming catalog on Steam. Given the company's long history of keeping its toys to itself, the move left many scratching their heads in confusion. This week, the inevitable occurred, with EA announcing that Origin will be no more. Instead, the PC gaming client will rebrand, rebuild, and become an optional place for EA gamers to play, rather than a Fort Knox for EA games.
EA has yet another piece of interconnected news to share: it's rebranding its Origin desktop app to simply be called the EA desktop app, alongside giving its PC platform a visual refresh.
Speaking to GamesIndustry.biz, EA SVP of strategic growth Mike Blank says the overhaul is intended "to create a more frictionless, fast, socially-oriented experience for our players, where it becomes the best place for them to connect with the people they want to play with in the games they want to play."
I'm frankly not used to giving EA a ton of kudos in these pages, but the overall strategy is a good one. The company appears to have finally realized that being permissive with gamers that just want to play the company's games is better business than trying to lock them into a failed client few want to use. The revamping of the UX was long needed, too, but the real star of the show here is that EA is looking to be more open in general.
"All of that is signaled by creating a common and consistent brand that is centered around EA and what EA stands for," Blank says. "And what signals it is this inflection about how EA stands for bringing your players together around the games they want to play on the platforms they want to play on. So yeah, it's not just a name change. It really signals an ethos that is critically important to us and that we know that's important to our players.
"It's been a long journey for EA in this regard to where our games show up and where they don't. One of the things that we value is democratizing gaming, which is: how do you enable more people to play? And how do you make it easy for them to do so? And by bringing our games to Steam, we are doing just that. So whether we were there in the past or not, I look towards the future. And what I think today is that we are stronger and healthier. And I think we're responding more effectively to the needs of our players today than we ever have, and Steam is part of that journey."
Again, this is EA we're talking about, so it's going to take more than just the right words to convince most of us that this truly is a new direction for the company. Still, these are the right words. EA has long built a reputation for itself as being anti-consumer in many ways, but all of those ways come down to one thing: control. For a company with that history to suddenly start giving up that control, not out of surrender but out of a belief that it's good business, is a positive step.
from the reporting-on-hacking dept
by Copia Institute - September 16th @ 3:30pm
Summary: Late in June 2020, a leak-focused group known as "Distributed Denial of Secrets" (a.k.a., "DDoSecrets") published a large collection of law enforcement documents apparently obtained by the hacking collective Anonymous.
The DDoSecrets data dump was timely, released as nationwide protests over the killing of a Black man by a white police officer neared their second consecutive month. Links to the files hosted at DDoSecrets' website spread quickly across Twitter, identified by the hashtag #BlueLeaks.
The 269-gigabyte trove of law enforcement data, emails, and other documents was taken from Netsential, which confirmed a security breach had led to the exfiltration of these files. The exfiltration was further acknowledged by the National Fusion Center Association, which told affected government agencies the stash included personally identifiable information. While this trove of data proved useful to activists and others seeking uncensored information about police activities, some expressed concern the personal info could be used to identify undercover officers or jeopardize ongoing investigations.
The first response from Twitter was to mark links to the DDoSecret files as potentially harmful to users. Users clicking on links to the data were told it might be unsafe to continue. The warning suggested the site might steal passwords, install malicious software, or harvest personal data. The final item on the list in the warning was a more accurate representation of the link destination: it said the link led to content that violated Twitter's terms of service.
Twitter's terms of service forbid users from "distributing" hacked content. This ban includes links to other sites hosting hacked content, as well as screenshots of forbidden content residing elsewhere on the web.
Shortly after the initial publication of the document trove, Twitter went further. It permanently banned DDoSecrets' Twitter account over its tweets about the hacked data. It also began removing tweets from other accounts that linked to the site.
Decisions to be made by Twitter:
Techdirt Podcast Episode 255: Threatcasting The Election
from the predicting-disinformation dept
by Leigh Beadon - September 16th @ 1:30pm
Late last year, we designed Threatcast 2020: a brainstorming game for groups of people trying to predict the new, innovative, and worrying forms of misinformation and disinformation that might come into play in the upcoming election. We ran a few in-person sessions before the pandemic hit and ended our plans for more, then last month we moved it online with the help of the fun interactive event platform Remo. We've learned a lot and hit on some disturbingly real-feeling predictions throughout these events, so this week we're joined by our partner in designing the game — Randy Lubin of Leveraged Play — to discuss our experiences "threatcasting" the 2020 election. We really want to run more of these online events for new groups, so if that's something you or your organization might be interested in, please get in touch!
Follow the Techdirt Podcast on Soundcloud, subscribe via iTunes or Google Play, or grab the RSS feed. You can also keep up with all the latest episodes right here on Techdirt.
That's A Wrap: Techdirt Greenhouse Content Moderation Edition
from the building-a-better,-more-ethical-internet dept
by Karl Bode - September 16th @ 12:03pm
When we launched Techdirt Greenhouse, we noted that we wanted to build a tech policy forum that not only tackled the thorniest tech policy issues of the day, but did so with a little more patience and nuance than you'll find at many gadget-obsessed technology outlets. After our inaugural panel tackled privacy, we just wrapped on our second panel subject: content moderation. We'd like to thank all those who participated in the panel, and all of you for reading.
You'd be hard pressed to find a thornier, more complicated subject than content moderation. On one hand, technology giants have spent years prioritizing ad engagement over protecting their user base from malicious disinformation and hate speech, often with fatal results. At the same time, many of the remedies being proposed cause more harm than good by trampling free speech, or putting giant corporations into the position of arbiters of acceptable public discourse. Moderation at this scale is a nightmare. One misstep in federal policy and you've created an ocean of new problems.
Whether it's the detection and deletion of live-streaming violence, or protecting elections from foreign and domestic propaganda, it's a labyrinthine, multi-tendriled subject that can flummox even experts in the field. We're hopeful that this collection of pieces helped inform the debate in a way that simplified some of these immensely complicated issues. Here's a recap of the pieces from this round in case you missed them:
Much like the privacy debate, crafting meaningful content moderation guidelines and rules (and ensuring consistent, transparent enforcement) was a steep uphill climb even during the best of times. Now the effort will share fractured attention spans and resources with an historic pandemic, a recovery from the resulting economic collapse, and the endless web of socioeconomic and political dysfunction that is the American COVID-19 crisis. But it's an essential discussion to have all the same, and we hope folks found this collection informative.
Again, we'd like to thank our participants for taking the time to provide insight during an increasingly challenging time. We'd also like to thank Techdirt readers and commenters for participating. In a few weeks we'll be announcing the next panel; one that should prove timely during an historic health crisis that has forced the majority of Americans to work, play, innovate, and learn from the confines of home.
from the so-many-arguments dept
by Mike Masnick - September 16th @ 10:50am
There have been a variety of lawsuits filed regarding Trump's silly Executive Order regarding TikTok, but one interesting one involves an employee of TikTok, Patrick Ryan, who filed suit on his own behalf to try to block the Executive Order from going into effect. A key part of Ryan's argument is that since the executive order bans transactions, it would mean his own salary from TikTok's parent company, ByteDance, might be blocked by the US government.
It is impossible to know now whether the Commerce Department will exempt the payment of wages and salaries from the dictates of the Executive Order, and Plaintiff will not know until the day the order is to take effect, but any plain reading of the language of the order would include the payment of wages and salaries to U.S. employees of TikTok within that definition
As such, Ryan asked the court to issue a Temporary Restraining Order to block the Executive Order from actually going into effect on September 20th. There's more to the lawsuit than that, but the DOJ responded to say "we won't block employee salaries."
The Department of Commerce can state that it does not intend to implement or enforce Executive Order 13942 in a manner which would prohibit the payment of wages and/or salaries to Plaintiff or any other employee or contractor of TikTok.
The Department of Commerce can state that it does not intend to implement or enforce Executive Order 13942 in a manner which would prohibit the provision of benefits packages to Plaintiff or any other employee of TikTok.
The Department of Commerce can state that it does not intend to implement or enforce Executive Order 13942 in a manner which would result in the imputation of civil or criminal liability to Plaintiff or any other employee or contractor of TikTok for performing otherwise lawful actions that are part of their regular job duties and responsibilities.
That caused Ryan's lawyers to declare at least an initial victory:
This morning, the Government advised Plaintiff’s counsel and later the Court that it in fact will not apply the Executive Order to the payment of TikTok wages, salaries or benefits, or impose civil or criminal sanctions against them for doing their jobs, thereby mooting the need to seek a temporary restraining order against the Government to protect the TikTok employees.
We are pleased that our litigation was able to achieve this fantastic result for the thousands of TikTok employees around the world, and we are confident that the remaining issues in this case also will be litigated fully to a successful conclusion, which will be the striking of the Executive Order as an unconstitutional overreach by this U.S. President.
However, it also made it easy for the judge to then deny the requested TRO:
Ryan’s application for a temporary restraining order is denied for two related reasons. First, there is a serious question about whether this Court has jurisdiction to issue a temporary restraining order at this point in time. It seems unlikely that the conflict between Ryan and the federal government has ripened into a true “case or controversy” within the meaning of Article III of the United States Constitution. Babbitt v. United Farm Workers National Union, 442 U.S. 289, 297 (1979). Whether Ryan could actually face prosecution for getting a paycheck from TikTok depends on a number of uncertain conditions. As a foundational matter, the President may only exercise his emergency powers to block transactions with a foreign-owned entity. ByteDance is widely reported to be in negotiations to alter its ownership structure in a manner that could result in non-enforcement of the Executive Order.
Even if that fails, the Secretary of Commerce would need to include payments to employees on the list of prohibited transactions. And then there would need to be real risk that the federal government would actually start prosecuting TikTok employees for receiving paychecks. That is an unlikely chain of events—indeed, yesterday the government filed a notice in this case specifying that the Department of Commerce “does not intend to implement or enforce [the Executive Order] in a manner which would prohibit the payment of wages and/or salaries to Plaintiff or any other employee or contractor of TikTok.” It is thus doubtful—at least at this time—that Ryan’s alleged fear that he faces prosecution is reasonable.
The second reason for denying the temporary restraining order is that, even if the Court presently has jurisdiction, Ryan has not demonstrated that he is likely to suffer irreparable harm absent an immediate ruling. His vague allegation that he would suffer reputational harm from the government’s implementation of the Executive Order against TikTok certainly does not suffice. Ulrich v. City and County of San Francisco, 308 F.3d 968, 982 (9th Cir. 2002) (citing Paul v. Davis, 424 U.S. 693, 701, 711 (1976)). And to the extent Ryan seeks to protect a future paycheck (or to protect against prosecution for receiving money that TikTok owes him for work performed), that protection could be readily provided at a later date, if and when the possibility of losing it becomes more concrete.
Of course, many of these cases may be moot, should the Treasury Department decide that the weird non-sale to Oracle solves any "problems" for TikTok.
Of course, there's still another lawsuit from a bunch of WeChat users about the Executive Order, and since there's no attempt to sell WeChat... that case may have a longer lifespan, but we'll cover that in another post (stay tuned).
Daily Deal: Interactive Learn to Code Bundle
from the good-deals-on-cool-stuff dept
by Daily Deal - September 16th @ 10:45am
The Interactive Learn to Code Bundle has 9 courses designed to help you learn to code and write programs. The courses cover SQL, JavaScript, jQuery, PHP, Python, Bootstrap, Java, and web design. Each concept is explained in depth and paired with simple tasks to help you cement your newly gained knowledge with some hands-on experience. It's on sale for $30.
Note: The Techdirt Deals Store is powered and curated by StackCommerce. A portion of all sales from Techdirt Deals helps support Techdirt. The products featured do not reflect endorsements by our editorial team.
The TikTok Oracle Grift: Insiders Admit They Went Hunting For A Tech Company The President Liked
from the because-of-course dept
by Mike Masnick - September 16th @ 9:43am
Earlier this week we wrote about the absolute grift involved in the TikTok / Oracle deal. Contrary to the framing that this was Oracle "buying" TikTok to satisfy the President's unconstitutional demand that the Chinese company ByteDance sell TikTok to an American company, the story showed that this was just a hosting deal for Oracle's cloud service, which is way down the list of top cloud providers.
The end result was no actual sale (though the Treasury Department is still "reviewing" the deal), but a big contract for Oracle, and a bogus story in which the President can pretend he forced ByteDance to "sell" TikTok, even though it retains ownership in the company (there are some rumors that the hosting deal will include a small, and probably symbolic, equity stake for Oracle).
The other key point I noted in my article was that Oracle's executive leadership, starting with Larry Ellison, but including CEO Safra Catz, have been cozying up to Trump and the White House ever since Trump became President. While much of Silicon Valley's executive teams have made it quite clear how uncomfortable they are with a Trump Presidency, Oracle... has done the opposite. And while I framed it as being convenient that things worked out this way, a report from the Wall Street Journal highlights how this was the grift from day one.
Oracle was originally brought into the negotiations to provide an alternative to Microsoft Corp., a rival bidder with Walmart as a partner, said one person familiar with the talks. The U.S. investment firms Sequoia Capital and General Atlantic, which are existing investors in ByteDance, went in search of a tech company with close ties to the administration and settled on Oracle, the person said.
As we discussed earlier, from day one of this silly ordeal the President made it clear he would steer a deal to a company he personally liked (meaning one whose leadership licked his boots, apparently). After first saying that he was going to block TikTok in the US entirely, Trump, as rumors of Microsoft's involvement came out, immediately said that he would block a deal with Microsoft. Microsoft had to go grovelling to Trump over a weekend to have him say, grudgingly, that they could continue to pursue a deal.
Then, of course, there's the Walmart story. Rumors came out that Walmart was a leading bidder, but Trump's White House stepped in and told them they would block that deal too, because it would show that Trump's figleaf "national security" explanation for banning TikTok was an obvious lie.
In other words, throughout this nonsense process, the President -- who at one point also demanded a finder's fee for any deal -- made it clear that this was about his own whims. He was forcing a sale, and he was going to pick a winner -- and the winner had to be someone he liked.
That doesn't seem like the kind of free market people appreciate. It seems like the very worst of a corrupt crony capitalist system.
The WSJ piece details just how corrupt this whole process was. It was all about stroking Trump's ego, and lining his campaign's bank account:
Sequoia Capital used its own connections to push the administration to allow TikTok to continue operating in the U.S., say people familiar with the discussions. Doug Leone, the firm’s global managing partner who took a lead role lobbying on TikTok’s behalf, has donated tens of thousands of dollars to Republican candidates this election season along with his wife, including to the president’s re-election campaign. Mr. Leone also held a reception at his Silicon Valley home for Secretary of State Mike Pompeo in January.
Ms. Catz’s relationship to Mr. Trump could give Oracle a leg up in getting the deal approved, according to people in contact with the White House. Oracle has worked for decades with the U.S. government, including multiple contracts with the national-security establishment.
Once again, this was nothing but performative nonsense from the President that ends in his own supporters lining their pockets. None of it seems to have anything to do with actual national security. It's a joke.
Josh Hawley Isn't 'Helping' When It Comes To TikTok
from the sound-and-fury,-signifying-nothing dept
by Karl Bode - September 16th @ 6:26am
It's the dumb saga that only seems to get dumber. Earlier this week, we noted that Trump's dumb and arguably unconstitutional order banning TikTok had resulted in (surprise) Trump friend and Oracle boss Larry Ellison nabbing a cozy little partnership for his fledgling cloud hosting business. Granted, the deal itself does absolutely nothing beyond providing Oracle a major client. It's more cronyism and heist than serious adult policy, yet countless outlets still somehow framed the entire thing as meaningful, ethical, and based in good faith (it's none of those things).
Senator Josh Hawley, one of the biggest TikTok pearl clutchers in Congress, obviously didn't much like the deal. Hawley sent an open letter to Treasury Secretary Steve Mnuchin calling the deal "completely unacceptable" and demanding an outright ban:
Reports say @tiktok_us has reached a deal with an American company. I’m urging the Trump Administration NOT to approve it unless it involves a clean break w/ @BytedanceTalk & total separation from #Beijing pic.twitter.com/Ciz5UovsYK
— Josh Hawley (@HawleyMO) September 14, 2020
Hawley's major complaint is correct in that the deal does absolutely nothing to thwart Chinese intelligence from collecting TikTok data since ByteDance would still own TikTok and control all algorithms:
"CFIUS should promptly reject any Oracle-ByteDance collaboration, and send the ball back to ByteDance’s court so that the company can come up with a more acceptable solution. ByteDance can still pursue a full sale of TikTok, its code, and its algorithm to a U.S. company, so that the app can be rebuilt from the ground up to remove any trace of CCP influence."
Here's the thing that Hawley, and every other TikTok pearl clutcher can't or won't understand: even a full ban of TikTok doesn't meaningfully thwart Chinese intelligence. Why? U.S. privacy and security standards are a joke. Sectors like telecom, adtech, and apps are such a poorly regulated dumpster fire (when they see any oversight at all), China can simply buy or steal this (and so much more) data from an absolute ocean of dodgy information brokers and middlemen.
Banning TikTok to protect U.S. consumer privacy is like spitting on a wildfire, then patting yourself on the back for being an incredible firefighter. The real solutions to these problems require taking a far smarter, broader, more holistic view. That means passing a meaningful privacy law, shoring up election reform, adequately funding privacy regulators, passing some standards for the IoT, adequately securing decade-old U.S. network vulnerabilities, mandating transparency in the adtech, telecom, and other sectors, and better policing the collection and sale of U.S. location and other data. Fix the broader problem(s), and TikTok becomes a detail.
Hawley not only doesn't seem to understand that, he's actively opposed to many of these broader reform efforts.
Hawley, much like Marsha Blackburn or Tom Cotton, oddly adores freaking out when China is involved, but is either absent from -- or detrimental to -- efforts to shore up overall U.S. privacy and security standards and oversight. Blackburn, Cotton, and Hawley don't make so much as a peep when U.S. telecom providers get mired in privacy scandals. They've said nary a word about the dodgy adtech sector and the way it sells access to U.S. user location data to any moron with a nickel. They've actively opposed election security reform, adequately funding or staffing the FTC, and passing even the most basic of privacy rules.
And yet when a Chinese company develops a product that outperforms the best Silicon Valley has to offer, there are months upon months of absolute and total "security and privacy" hysteria. It's just weird how, for some folks, security and privacy only seem to matter when foreigners are involved. It's performative, xenophobic, wildly inconsistent, and largely just stupid. Either you genuinely care about U.S. security and privacy or you don't. Showing up late, crying about China, then disappearing entirely when broader solutions are recommended isn't "helping," it's performative histrionics.
from the too-much-is-never-enough dept
by Glyn Moody - September 16th @ 3:27am
The passage of the EU Copyright Directive last year represented one of the most disgraceful examples of successful lobbying and lying by the publishing, music, and film industries. In order to convince MEPs to vote for the highly controversial legislation, copyright companies and their political allies insisted repeatedly that the upload filters needed to implement Article 17 (originally Article 13) were optional, and that user rights would of course be respected online. But as Techdirt and many others warned at the time, this was untrue, as even the law's supporters admitted once it had been passed. Now that the EU member states are starting to implement the Directive, it is clear that there is no alternative to upload filters, and that freedom of speech will therefore be massively harmed by the new law. France has even gone so far as to ignore the requirement for the few user protections that the Copyright Directive graciously provides.
The EU Copyright Directive represents an almost total victory for copyright maximalists, and a huge defeat for ordinary users of the Internet in the EU. But if there is one thing that we can be sure of, it's that the copyright industries are never satisfied. Despite the massive gains already enshrined in the Directive, a group of industry organizations from the world of publishing, music, cinema and broadcasting have written to the EU Commissioner responsible for the Internal Market, Thierry Breton, expressing their "serious concerns regarding the European Commission's consultation on its proposed guidance on the application of Article 17 of the Directive on Copyright in the Digital Single Market ("the Directive")." The industry groups are worried that implementation of the EU Copyright Directive will provide them with too little protection (pdf):
We are very concerned that, in its Consultation Paper, the Commission is going against its original objective of providing a high level of protection for rightsholders and creators and to create a level playing field in the online Digital Single Market. It interprets essential aspects of Article 17 of the Directive in a manner that is incompatible with the wording and the objective of the Article, thus jeopardising the balance of interests achieved by the EU legislature in Article 17.
In an Annex to the letter, the copyright industries raise four "concerns" with the proposed guidance on the implementation of Article 17. The former MEP Julia Reda, who valiantly led the resistance against the worst aspects of the Copyright Directive during its passage through the EU's legislative system, has answered in detail all of the points in a thread on Twitter. It's extremely clearly explained, and I urge you to read it to appreciate the full horror of what the copyright companies are claiming and demanding. But there is one "concern" of the copyright maximalists that is so outrageous that it deserves to be singled out here. Reda writes:
#Article17 clearly says that legal content must not be blocked. #Uploadfilters can't guarantee that, so rightholders claim that this is fulfilled as long as users have the right to complain about wrongful blocking *after* it has already happened.
This completely goes against what users fought for in the negotiations and what #Article17 says, that it "shall in no way affect legitimate uses". Of course, if all legal parodies, quotes etc. get automatically blocked by #uploadfilters, legitimate uses are affected pretty badly.
The copyright companies and their political friends tricked the European Parliament into voting through Article 17 by claiming repeatedly that it did not require upload filters, which were rightly regarded as unacceptable. Now, the companies are happy to admit that the law's requirement to assess whether uploads are infringing before they are posted -- which can only be done using algorithms to filter out infringing material -- is "practically unworkable". Instead, they want blocking to be the default when there is any doubt, forcing users to go through a process of complaining afterwards if they wish their uploads to appear. Since most people will not know how to do this, or won't have the time or energy to do so, this will inevitably lead to vast amounts of legal material being blocked by filters.
As Reda rightly summarizes:
The entertainment industry is joining forces to push for the worst possible implementation of #Article17, which would not only require out-of-control #uploadfilters without any safeguards, but also violate fundamental rights AND the very text of Article 17 itself.
The EU Copyright Directive's Article 17 already promises to be disastrous for user creativity and freedom of speech in the EU; unfortunately, the proposed EU guidance has some additional aspects that are problematic for end users (pdf), as a group of civil society organizations point out in their own letter to the EU Commissioner. What the industry's demands show once again is that no matter how strong copyright is made, no matter how wide its reach, and no matter how disproportionate the enforcement powers are, publishing, music, film and broadcasting companies always want more. Their motto is clearly: "too much is never enough".