• 32 Posts
  • 50 Comments
Joined 2Y ago
Cake day: Aug 24, 2020


Why can’t we have both?


And yet they aren’t able to shut down all the CP on this fucking platform; just yesterday an idiot came into a group I manage openly asking for CP. If Telegram is gonna use mass surveillance, at least make it useful, goddamn.


The US will push you to be homeless, and then jail you for that.


What the hell did I just read?


We’ve known that has been happening for a long time. Still, it’s kinda scary.


Thanks, I looked at it and the server is great. That said, the group I created is going well too.


I created a Matrix Space for trans, NB, and gender non-conforming people.
Hi, I've been in a trans chat group on Matrix for a while, but moderation is non-existent and chasers run free in there, so I created a safe space for trans and NB people, and I'll be moderating it (I'm looking for additional moderators). There is also a private NSFW room, also exclusively for trans and NB people, but you have to ask for the link to it. Feel free to join. https://matrix.to/#/#trans-save-space:matrix.org

I know nothing about LoL (except for the music, which is always dope). I tried to play many years ago but it didn’t go anywhere.

Yet I absolutely love arcane. The animation, the story, everything is so fucking perfect. And I love Jinx, 100% transition goal.

I wanna know more about the lore, it looks so good



At this point they should just ask Anthony for help, since Linus has no idea what he is doing. Luke, though, is having a better experience and not too many complaints. On a recent WAN Show, Luke said that he likes Linux more for working since “it gets less in the way than Windows.”


In my country people are worse; they say “No sea marico, déjelo que piense lo que quiera” (don’t be a [removed], let them think whatever they want).

I’ve heard that phrase in every context you can imagine, from pointing out offensive jokes and stereotypes, to avoiding conversations about whether a specific genocide is justified. People here are assholes; they totally ignore the “we can’t tolerate intolerant people in order to have a democratic system” principle.


I fucking hate having a post deleted on Reddit because of some stupid formatting rule they have.




Just a few days ago I was looking for a Markdown editor that allows me to export to ODF right away.

It does that and also allows exporting as HTML and JPEG.

It is amazing, really.



Didn’t know that Blender had their own PeerTube. Looks great.

I love seeing a new movie from Blender Studio. I look forward to theirs more than Pixar’s.


Just practice more. Find someone to talk to and engage in casual conversations. You’ll get better over time.


Oh shit, I need one of those but with Latin America. I guess it would be all red. Average internet speed in Venezuela doesn’t even reach 10 Mbps.


That actually looks good. Let’s see how people use it.



Bee movie was a warning


What opinions do you have on the Facebook rebranding?
Do you think Facebook is just trying to wash their hands, as Facebook itself will become part of a bigger conglomerate they created themselves?

A new study by Canadian researchers adds weight to the theory that the evolutionary role of gay men may be to serve as “super uncles” who help close family members survive. Paul Vasey, an evolutionary psychologist at the University of Lethbridge, sought to address an entrenched scientific riddle: If homosexuality appears to be inherited, how have gay men, who are less likely to reproduce, continued to pass on their genes without becoming extinct?

According to The Gazette of Montreal, one long-running theory argues that gay men serve the evolutionary role of acting as “super uncles” who assist close relatives and indirectly increase the chances of passing on their genes. “The idea is that homosexuals are helping their close relatives reproduce more successfully and at a higher rate by being helpful: babysitting more, tutoring their nieces and nephews in art and music, and helping out financially with things like medical care and education,” reports The Gazette.

Vasey and his colleague Doug VanderLaan tested the theory on the Pacific island of Samoa, where they studied women, straight men, and the fa'afafine, men who prefer other men as sexual partners and are accepted within the culture as a distinct third gender category. “Vasey found that the fa'afafine said they were significantly more willing to help kin, yet much less interested in helping children who aren't family — providing the first evidence to support the ‘kin selection hypothesis,’” reports The Gazette. “Maybe it's in this way that they're indirectly passing on at least some of the genes that they're sharing with their kin,” said Vasey.

The findings are published online this week in the journal Psychological Science. Researchers are now exploring whether the fa'afafine actually follow through on their stated willingness to help family members by giving more money to relatives.


Scribe.rip Open source Medium Front end
This is a recent toot from the creator: I've made an alternative frontend to Medium: https://scribe.rip If you, like me, are occasionally forced to read articles on medium.com, now you can read them on Scribe instead! Feedback welcome. @edwardloveall@mastodon.technology



Magnus Hirschfeld (born May 14, 1868, Kolberg, Prussia [now Kołobrzeg, Poland]—died May 14, 1935, Nice, France), German physician who was an important theorist of sexuality and a prominent advocate of gay rights in the early 20th century.

Hirschfeld was born to Jewish parents in a Prussian town on the Baltic coast. He first studied modern languages and then medicine, obtaining a doctoral degree in 1892. After a period of travel, he returned to Germany and established a medical practice in Magdeburg in 1894. Two years later he moved to Berlin, where he would become actively involved in the scientific study of sexuality—in particular, homosexuality—and advocacy efforts on behalf of sexual minorities. Hirschfeld maintained that sexual orientation was innate and not a deliberate choice, and he believed that scientific understanding of sexuality would promote tolerance of sexual minorities. His sexology research was guided by empiricism and activism, driven by the belief that the sexual ideology of Judeo-Christian civilization was a serious obstacle to the understanding of sexuality and to the reform of laws and practices that regulated it. Hirschfeld accomplished an enormous amount of work during his lifetime with regard to his research, writing, and advocacy efforts.

In 1897 Hirschfeld established the Scientific-Humanitarian Committee with Max Spohr, Franz Josef von Bülow, and Eduard Oberg; it was the world’s first gay rights organization. Its main goal was to fight for the abolishment of Paragraph 175 of the German Imperial Penal Code, which punished sexual contact between men. In 1899 he started the Yearbook of Intermediate Sexual Types, the first journal in the world to deal with sexual variants; it was regularly published until 1923. He also published an important study on cross-dressing, The Transvestites (1910). Hirschfeld was one of the founders of the Medical Society for Sexual Science and Eugenics, established in 1913. The next year he published his study Homosexuality in Men and Women, which was based on the expansive statistical surveys on homosexuality that he had conducted. In addition to publishing works on sexology and sexual reforms, Hirschfeld also wrote about racism, politics, and the history of morals.

In 1919 Hirschfeld opened the first sexology institute in the world, the Institute for Sexual Science, in Berlin; the institute and the considerable holdings of its library and archives were destroyed by Nazi demonstrators in 1933. Hirschfeld also participated in the production of the first film to call for the decriminalization and acceptance of homosexuality, Different from the Others (1919). The controversial film ignited much debate and was banned by German officials within a year.

In 1928 Hirschfeld founded the World League for Sexual Reform (WLSR), which had its roots in an earlier conference that he had organized in 1921, the First International Conference for Sexual Reform on a Scientific Basis. The WLSR called for reform of sex legislation, the right to contraception and sex education, and legal and social equality of the sexes.

Being a Jew, a gay man, and a sexual liberation activist made Hirschfeld the target of right-wing supporters, and he suffered serious injuries from an attack in 1920. Later, with the Nazis’ growing power, he was regularly assaulted, his lectures were disrupted, and, upon completion of his international speaking tour in 1932, he was unable to return to Germany. He instead went to Switzerland and then in 1934 to France, where he died the next year.

Is Silence still maintained?
I wanted to install Silence, but F-Droid says it was last updated 2 years ago. Is the project still alive and active? I remember using it a while ago and I really liked it. That's why I want to install it again.






Do you think the fediverse has become a safe space for radicals?
The alt social media are a good place to run away from algorithms and the explicit mass manipulation of other big tech, to be more free and independent, and to spread ideas, but I also think that the fediverse is filling up with people with radical political ideologies, some kicked out of traditional social platforms for this same reason. I'm not saying that we can't have discussions, but I think that many people are making their own echo chambers in the fediverse.


“Trump Accuses Google of Burying Conservative News in Search Results,” reads an August 28 New York Times headline. The piece features a bombastic president, a string of bitter tweets, and accusations of censorship. “Algorithms” are mentioned, but not until the twelfth paragraph.

Trump—like so many other politicians and pundits—has found search and social media companies to be convenient targets in the debate over free speech and censorship online. “They have it RIGGED, for me & others, so that almost all stories & news is BAD,” the president recently tweeted. He added: “They are controlling what we can & cannot see. This is a very serious situation---will be addressed!”

Trump is partly right: They are controlling what we can and cannot see. But “they” aren’t the executives leading Google, Facebook, and other technology companies. “They” are the opaque, influential algorithms that determine what content billions of internet users read, watch, and share next. These algorithms are invisible, but they have an outsized impact on shaping individuals’ experience online and society at large. Indeed, YouTube’s video-recommendation algorithm inspires 700,000,000 hours of watch time per day—and can spread misinformation, disrupt elections, and incite violence.

Algorithms like this need fixing. But in this moment, the conversation we should be having—how can we fix the algorithms?—is instead being co-opted and twisted by politicians and pundits howling about censorship and miscasting content moderation as the demise of free speech online. It would be good to remind them that free speech does not mean free reach. There is no right to algorithmic amplification. In fact, that’s the very problem that needs fixing.

TO SEE HOW this algorithmic amplification works, simply look to RT, or Russia Today, a Russian state-owned propaganda outlet that’s also among the most popular YouTube presences. RT has amassed more than 6 billion views across 22 channels, more than MSNBC and Fox News combined. According to YouTube chief product officer Neal Mohan, 70 percent of views on YouTube are from recommendations—so the site’s algorithms are largely responsible for amplifying RT’s propaganda hundreds of millions of times.

How? Most RT viewers don’t set out in search of Russian propaganda. The videos that rack up the views are RT’s clickbait-y, gateway content: videos of towering tsunamis, meteors striking buildings, shark attacks, amusement park accidents, some that are years old but have comments from within an hour ago. This disaster porn is highly engaging; the videos have been viewed tens of millions of times and are likely watched until the end. As a result, YouTube’s algorithm likely believes other RT content is worth suggesting to the viewers of that content—and so, quickly, an American YouTube user looking for news finds themselves watching Russia’s take on Hillary Clinton, immigration, and current events. These videos are served up in autoplay playlists alongside content from legitimate news organizations, giving RT itself increased legitimacy by association.

The social internet is mediated by algorithms: recommendation engines, search, trending, autocomplete, and other mechanisms that predict what we want to see next. The algorithms don’t understand what is propaganda and what isn’t, or what is “fake news” and what is fact-checked. Their job is to surface relevant content (relevant to the user, of course), and they do it exceedingly well. So well, in fact, that the engineers who built these algorithms are sometimes baffled: “Even the creators don’t always understand why it recommends one video instead of another,” says Guillaume Chaslot, an ex-YouTube engineer who worked on the site’s algorithm. These opaque algorithms with their singular purpose—“keep watching”—coupled with billions of users are a dangerous recipe.

In recent years, we’ve seen how dire the consequences can be. Propaganda like RT content is circulated far and wide to disinform and worsen polarization, especially during democratic elections. YouTube’s algorithms can also radicalize by suggesting “white supremacist rants, Holocaust denials, and other disturbing content,” Zeynep Tufekci recently wrote in the Times. “YouTube may be one of the most powerful radicalizing instruments of the 21st century.”

The problem extends beyond YouTube, though. On Google search, dangerous anti-vaccine misinformation can commandeer the top results. And on Facebook, hate speech can thrive and fuel genocide. A United Nations report about the genocide in Myanmar reads: “The role of social media is significant. Facebook has been a useful instrument for those seeking to spread hate, in a context where for most users Facebook is the Internet … The extent to which Facebook posts and messages have led to real-world discrimination and violence must be independently and thoroughly examined.”

So what can we do about it? The solution isn’t to outlaw algorithmic ranking or make noise about legislating what results Google can return. Algorithms are an invaluable tool for making sense of the immense universe of information online. There’s an overwhelming amount of content available to fill any given person’s feed or search query; sorting and ranking is a necessity, and there has never been evidence indicating that the results display systemic partisan bias. That said, unconscious bias is a concern in any algorithm; this is why tech companies have investigated conservative claims of bias since the Facebook Trending News debacle of 2016. There hasn’t been any credible evidence. But there is a trust problem, and a lack of understanding of how rankings and feeds work, and that allows bad-faith politicking to gain traction.

The best solution to that is to increase transparency and internet literacy, enabling users to have a better understanding of why they see what they see—and to build these powerful curatorial systems with a sense of responsibility for what they return. There have been positive steps in this direction. The examples of harms mentioned above have sparked congressional investigations aimed at understanding how tech platforms shape our conversations and our media consumption. In a Senate hearing next week, the Senate Intelligence Committee will ask Jack Dorsey of Twitter and Sheryl Sandberg of Facebook to provide an accounting of how, specifically, they are taking steps to address computational propaganda.

It’s imperative that we focus on solutions, not politics. We need to build on those initial investigations. We need more nuanced conversations and education about algorithmic curation, its strange incentives, and its occasionally unfortunate outcomes. We need to hold tech companies accountable—for irresponsible tech, not evidence-free allegations of censorship—and demand transparency into how their algorithms and moderation policies work. By focusing on the real problem here, we can begin addressing the real issues that are disrupting the internet—and democracy.

