“Risk is our business,” James T. Kirk once said. “That’s what this starship is all about. That’s why we’re aboard her.”
More than a half-century later, the performer who breathed life into the fabled Enterprise captain is, at age 90, making that kind of risk his own business and heading toward the stars under dramatically different circumstances than his fictional counterpart. And in doing so, William Shatner is causing worlds to collide, or at least permitting parallel universes to coexist — the utopian spacefaring vision of “Star Trek” and the evolving, increasingly commercial spot that “space” holds in the American psyche.
When Shatner boards Jeff Bezos’ Blue Origin NS-18 in Texas at around dawn Wednesday, his one small step into the craft will create one of the ultimate crossover stories of our era.
It’s about space and exploration, sure, and certainly about capitalism and billionaires and questions of economic equity. But it’s also about popular culture and marketing and entertainment and nostalgia and hope and Manifest Destiny and, and, and … well, you get the idea.
“What will I see when I’m out there?” Shatner wondered last week, talking to Anderson Cooper on CNN. An equally valid question is this: What will WE see when he’s out there?
It will be a complex blend of human dreams superimposed upon technology and hope, braggadocio and cash, and the notion that space travel elevates us — all orchestrated by a company under criticism for what some call the decidedly un-utopian, tech-bro ways that it operates.
Is all that and “Star Trek” a good fit?
THE WORLD OF ‘STAR TREK’
Since its 1966 premiere with one of the most diverse casts TV had ever seen, “Trek” has grown from Gene Roddenberry’s fever dream of a “‘Wagon Train’ to the stars” into an intricate transmedia universe full of subtleties and traditions and rules.
Among them: Human beings avoid killing each other. Money is generally outdated, as are hunger and poverty. Greed is aberrant. Noninterference in other cultures is the most sacred principle of all. And within the United Federation of Planets, the spacefaring United Nations of “Star Trek,” exploration, not domination, is the coin of the realm. In short, unlike a lot of humanity right now.
That 1966-69 original series used allegory to evade network censors and tell stories about racism and xenophobia and even the Vietnam War. How could they get away with all that? Because the adventures of Kirk’s Enterprise took place against a backdrop of 23rd-century space travel — something directly relevant to the world as well, given that humans first set foot on the moon 47 days after the original series’ final episode.
Over the next half century, backed by a vocal fan base, “Star Trek” roared back for more and, in the process, led the way in cementing space travel as an ideal canvas for relevant storytelling.
Even as NASA’s Apollo era ebbed into the space shuttle program (where an early craft was named “Enterprise”) and eventually into uncertainty, “Trek” remained one of the culture’s central vehicles for a spacefaring future.
Nichelle Nichols, who played Lt. Uhura on the show, was a particularly tireless advocate, working with NASA to recruit Americans of color and women and make sure they could occupy the center of such ambitions as the missions marched forward.
In the 1980s, movies about the original crew dealt with aging and regret. “Star Trek: The Next Generation” offered a more cerebral but still utopian vision. Another spinoff, “Star Trek: Deep Space Nine,” set at an outpost preserving a delicate detente, presented a darker take — but still one in which avarice was anomalous and worthy of scorn. And “Enterprise,” a 2001-2005 prequel, offered a season-long arc about the aftermath of a 9/11-style extraterrestrial attack on Earth.
Two of the latest iterations of the myth, “Star Trek: Discovery” and “Star Trek: Picard,” have dipped deeper into darkness than their predecessors and have toyed with the notion that not all humanity wants to be quite that utopian.
In all that varying storytelling, though, one constant remained: the notion that human space travel would become a vector of ethics and goodness that elevated the galaxy rather than plundered it.
THE PROFITABLE FRONTIER
Which brings us to companies like Blue Origin, Elon Musk’s SpaceX and Richard Branson’s Virgin Galactic — endeavors that build their brands not upon countries but corporations.
They offer the culture a late-stage capitalism variation on the theme — a narrative that space travel isn’t just for scientists and diplomats but for you and me, too. If, that is, you and me happen to have a few hundred thousand dollars or more of walking-around money on hand.
“The United States always has had private people working for the public purpose,” says Ravi S. Rajan, president of the California Institute for the Arts and a “Trek” fan since childhood. “But how much is done privately and how much is done publicly, that changes.”
Many have impugned the billionaire space moguls’ actions, including the secretary-general of the United Nations, and the troubles of Blue Origin’s corporate culture are well-documented of late.
But the motives of the Amazon founder himself remain unclear. It is evident, though, that the popular culture of space travel has influenced him deeply.
Bezos, who tells a story of exploring space to help ensure Earth’s continued prosperity, is a longtime “Trek” fan. He made a cameo as an alien Starfleet official in the 2016 movie “Star Trek Beyond.” And according to biographer Brad Stone, Bezos even fleetingly considered calling Amazon “Makeitso.com,” after Capt. Jean-Luc Picard’s favorite command in “Star Trek: The Next Generation.”
“The whole ethos of ‘Star Trek’ showed people who were different-looking, with different skills, working together. We are in the opening moments of something like that,” says Richard B. Cooper, vice president of the Space Foundation, a nonprofit that advocates for the global space industry. “People can look at this environment and say, ‘Hey — I belong there, too.’”
Prohibitive costs aside (and that’s a big aside), Cooper has a point. Though the likes of Shatner may not be “regular people,” the shift from the dominance of the test pilot and the scientist tracks with the populism of our era, where — it must be said — the exactitude of science is being called into question as never before. And as Cooper points out, “it gives people hope. And if there’s one thing the world’s in short supply of, it’s that essential payload.”
That kind of storyline — hope, heroism, competitive dominance and an unerring sense of competence that can at times overlap with testosterone — could be one key reason why the commercial space outfits are thriving. At a moment when NASA and nation-focused space travel lacks a compelling Hollywood narrative, the entrepreneurs and their marketers step right in.
“American dominance in space, nobody cares about it. It’s Bezos who says, ‘We can’t go on living like this. We have to save the planet,’” says Mary-Jane Rubenstein, a professor of religion and science in society at Wesleyan University. What results, she says, is “a kinder, gentler colonialism” in which humans take to orbit under premises that seem justifiable but require closer scrutiny.
“It’s the billionaires who have the utopian visions,” says Rubenstein, author of the upcoming book “Astrotopia: The Dangerous Religion of the Corporate Space Race.”
“The states can’t muster them,” she says. “They have no story.”
We live in an era where the fictional and the real have an intricate relationship, and sometimes it’s hard to separate them. Something like this, a collision of dreams and real-life ambition and achievement, couldn’t have a more effective ambassador than the outsized personality that is William Shatner.
“I was there last week rehearsing, whatever they call it,” Shatner told Anderson Cooper.
“Training I think is what they call it,” Cooper said, to which Shatner responded: “I think of it as rehearsal.”
And there it is again — the storyline, compelling as ever, stealing oxygen from other important questions. Should we even be colonizing space? Don’t we have enough going on here at home to worry about? Aren’t there people with problems more pressing than this who could use the cash?
And what if we encounter life that’s not life as we know it, and harm it out of obliviousness or greed? It’s not as if that hasn’t happened countless times here on the ground, in the land that put a man on the moon but still grapples with a history brimming with horrors from slave markets to smallpox blankets. These are only some of the questions that will ascend and descend with Shatner on Wednesday.
Is it a stunt? Sure. Is it a genius marketing ploy? Absolutely. Is it cynical and self-aggrandizing and designed solely to make more money and grab more attention for the world’s richest man? You’re going to have to decide that one yourself.
In the meantime, consider the autobiographical song called “Real” that Shatner recorded in 2004 with country singer Brad Paisley.
“I’d love to help the world and all its problems. But I’m an entertainer, and that’s all,” he says in it. “So the next time there’s an asteroid or a natural disaster, I’m flattered that you thought of me — but I’m not the one to call.”
Turns out, he is — this time. But next time? In the future of the final frontier and the culture that has grown up around it — in this unusual realm where risk IS the business — that’s eventually going to have to be addressed.
Facebook dithered in curbing divisive user content in India
NEW DELHI, India (AP) — Facebook in India has been selective in curbing hate speech, misinformation and inflammatory posts, particularly anti-Muslim content, according to leaked documents obtained by The Associated Press, even as the internet giant’s own employees cast doubt over its motivations and interests.
Ranging from research produced as recently as March of this year to company memos that date back to 2019, the internal company documents on India highlight Facebook’s constant struggles in quashing abusive content on its platforms in the world’s biggest democracy and the company’s largest growth market. Communal and religious tensions in India have a history of boiling over on social media and stoking violence.
The files show that Facebook has been aware of the problems for years, raising questions over whether it has done enough to address the issues. Many critics and digital experts say it has failed to do so, especially in cases where members of Prime Minister Narendra Modi’s ruling Bharatiya Janata Party are involved.
Across the world, Facebook has become increasingly important in politics, and India is no different.
Modi has been credited for leveraging the platform to his party’s advantage during elections, and reporting from The Wall Street Journal last year cast doubt over whether Facebook was selectively enforcing its policies on hate speech to avoid blowback from the BJP. Modi and Facebook chairman and CEO Mark Zuckerberg have exuded bonhomie, memorialized by a 2015 image of the two hugging at the Facebook headquarters.
The leaked documents include a trove of internal company reports on hate speech and misinformation in India that in some cases appeared to have been intensified by its own “recommended” feature and algorithms. They also include the company staffers’ concerns over the mishandling of these issues and their discontent over the viral “malcontent” on the platform.
According to the documents, Facebook saw India as one of the most “at risk countries” in the world and identified both Hindi and Bengali languages as priorities for “automation on violating hostile speech.” Yet, Facebook didn’t have enough local language moderators or content-flagging in place to stop misinformation that at times led to real-world violence.
In a statement to the AP, Facebook said it has “invested significantly in technology to find hate speech in various languages, including Hindi and Bengali” which “reduced the amount of hate speech that people see by half” in 2021.
“Hate speech against marginalized groups, including Muslims, is on the rise globally. So we are improving enforcement and are committed to updating our policies as hate speech evolves online,” a company spokesperson said.
This AP story, along with others being published, is based on disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen’s legal counsel. The redacted versions were obtained by a consortium of news organizations, including the AP.
Back in February 2019 and ahead of a general election when concerns of misinformation were running high, a Facebook employee wanted to understand what a new user in India saw on their news feed if all they did was follow pages and groups solely recommended by the platform itself.
The employee created a test user account and kept it live for three weeks, a period during which an extraordinary event shook India — a militant attack in disputed Kashmir had killed over 40 Indian soldiers, bringing the country close to war with rival Pakistan.
In the note, titled “An Indian Test User’s Descent into a Sea of Polarizing, Nationalistic Messages,” the employee whose name is redacted said they were “shocked” by the content flooding the news feed. The person described the content as having “become a near constant barrage of polarizing nationalist content, misinformation, and violence and gore.”
Seemingly benign and innocuous groups recommended by Facebook quickly morphed into something else altogether, where hate speech, unverified rumors and viral content ran rampant.
The recommended groups were inundated with fake news, anti-Pakistan rhetoric and Islamophobic content. Much of the content was extremely graphic.
One included a man holding the bloodied head of another man covered in a Pakistani flag, with an Indian flag partially covering it. Its “Popular Across Facebook” feature showed a slew of unverified content related to the retaliatory Indian strikes into Pakistan after the bombings, including an image of a napalm bomb from a video game clip debunked by one of Facebook’s fact-check partners.
“Following this test user’s News Feed, I’ve seen more images of dead people in the past three weeks than I’ve seen in my entire life total,” the researcher wrote.
The report sparked deep concerns over what such divisive content could lead to in the real world, where local news outlets at the time were reporting on Kashmiris being attacked in the fallout.
“Should we as a company have an extra responsibility for preventing integrity harms that result from recommended content?” the researcher asked in their conclusion.
The memo, circulated with other employees, did not answer that question. But it did expose how the platform’s own algorithms or default settings played a part in producing such objectionable content. The employee noted that there were clear “blind spots,” particularly in “local language content.” They said they hoped these findings would start conversations on how to avoid such “integrity harms,” especially for those who “differ significantly” from the typical U.S. user.
Even though the three weeks of research didn’t represent an average period, the researcher acknowledged that it showed how such “unmoderated” and problematic content “could totally take over” during “a major crisis event.”
The Facebook spokesperson said the test study “inspired deeper, more rigorous analysis” of its recommendation systems and “contributed to product changes to improve them.”
“Separately, our work on curbing hate speech continues and we have further strengthened our hate classifiers, to include four Indian languages,” the spokesperson said.
Microsoft and Etisalat partner to develop 5G enterprise services
Etisalat and Microsoft are joining forces to deliver 5G enterprise services built on Azure Multi-access Edge Compute, a move that underscores the tech giant’s partnership with the Emirati telco.

In a press release, Etisalat said its recent collaboration with Microsoft’s Digital Crimes Unit (DCU) will strengthen digital security in the Middle East, heightening the region’s defenses against cyber threats.

The partnership will harness both companies’ capabilities, pairing Etisalat Core Orchestration with Microsoft’s Azure ARM to develop a 5G-driven, plug-and-play edge computing framework that lets companies leverage threat intelligence solutions.
“Azure MEC offers service providers and customers the same set of tools to build and manage their cloud infrastructure. Our customers can maximize their efforts by employing a ‘build once and deploy many’ strategy to optimize their investments,” regional director, Enterprise and Partner Group (EPG), Microsoft UAE, Naim Yazbeck, said in a statement.
Edge computing processes data on-site or near the data source, reducing the need to send it to a remote data center for processing. The technology aims to optimize various industries, minimize latency, and support full application hosting for fast, secure 5G, Internet of Things (IoT) and artificial intelligence (AI) applications.
“Etisalat Edge Computing solutions will help customers transform the way they operate, especially transportation, smart manufacturing, logistics, and Oil and Gas,” vice president Fixed and Mobile Core at Etisalat, Khaled Al Suwaidi, said in a statement.
“This drastically increases the value for traditional networks to transition into 5G to develop intelligent and autonomous next-generation technology that unlocks potential opportunities to our customers,” he added.
According to the press release, the partnership between the two companies extends a promising collaboration begun last year, when Etisalat joined forces with Microsoft on a public cloud-first plan via a digital transformation program that allowed the Emirati telco to develop a platform built around automation and AI.
Facebook’s oversight board seeks details on VIPs’ treatment
Facebook’s semi-independent oversight board says the company has fallen short of full disclosure on its internal system that exempts high-profile users from some or all of its content rules.
Facebook “has not been fully forthcoming” with the overseers about its “XCheck,” or cross-check, system, the board said in a report Thursday. It also said it will review the system and recommend how the social network giant could change it.
The board started looking into the XCheck system last month after The Wall Street Journal reported that many VIP users abuse it, posting material that would cause ordinary users to be sanctioned — including for harassment and incitement of violence. For certain elite users, Facebook’s rules reportedly don’t seem to apply. There were at least 5.8 million exempted users as of last year, according to the Journal article.
Facebook is generally not bound under the oversight board’s rules to follow its recommendations.
“We believe the board’s work has been impactful, which is why we asked the board for input into our cross-check system, and we will strive to be clearer in our explanations to them going forward,” Facebook said in a statement Thursday.
The report said Facebook wrongly failed to mention the XCheck system when it asked the board earlier this year to rule on its ban on former President Donald Trump’s accounts following the Jan. 6 assault on the Capitol.
“Facebook only mentioned cross-check to the board when we asked whether Mr. Trump’s page or account had been subject to ordinary content-moderation processes,” the report said.
In May, the board upheld Facebook’s suspension of Trump’s accounts, which came out of concern that he incited violence leading to the Jan. 6 riot. But the overseers told Facebook to specify how long the suspension would last. Facebook later announced that Trump’s accounts would be suspended for two years, freezing his presence on the social network until early 2023, to be followed by a reassessment.
Trump announced Wednesday the launch of a new media company with its own social media platform. He said his goal is to create a rival to the Big Tech companies that have shut him out and denied him the megaphone that was paramount in his national rise.
Twitter, which was Trump’s platform of choice, banned him permanently after the Jan. 6 assault.
The oversight board said Thursday that for its review, Facebook agreed to provide the internal company documents on the XCheck system that were referenced in the Journal article. Facebook documents were leaked to the newspaper by Frances Haugen, a former product manager in the company’s civic integrity unit who also provided them to Congress and went public this month with a far-reaching condemnation of the company.
In a separate blog post, the board said Haugen has accepted its invitation for a meeting in coming weeks, to discuss her experiences “and gather information that can help push for greater transparency and accountability from Facebook through our case decisions and recommendations.”
Haugen’s accusations of possible serious harm to some young people from Facebook’s Instagram photo-sharing platform raised outrage among lawmakers and the public.
The board said in its report that in some cases, “Facebook failed to provide relevant information to the board, while in other instances, the information it did provide was incomplete.”
In a briefing to the board, “Facebook admitted it should not have said that (XCheck) only applied to a ‘small number of decisions,'” the report said. “Facebook noted that for teams operating at the scale of millions of content decisions a day, the numbers involved … seem relatively small, but recognized its phrasing could come across as misleading.”
Facebook created the oversight panel to rule on thorny content issues following widespread criticism of its problems responding swiftly and effectively to misinformation, hate speech and harmful influence campaigns. The board’s decisions have tended to favor free expression over the restriction of content. Its members include a former prime minister of Denmark and a former editor-in-chief of British newspaper the Guardian, along with legal scholars, human rights experts and journalists.
The board’s independence has been questioned by critics who say it’s a Facebook PR campaign intended to draw attention away from deeper problems of hate and misinformation that flourish on its platforms.