> It might end up like the copyright system. I've got a bad-ish feeling.

The algorithms they use suck ass.
> The facts are on my side, sorry. It's not like this stuff can't be tracked, just that not all of it can be.

There it is again. Russophobia has been normalized, that's for sure.
> I don't see the irony in that, no. The president isn't a king, so if Twitter sets the precedent that they aren't going to charge him with any crimes for making online threats, then that precedent applies to everyone else using Twitter within the US too.

I'm certainly no fan of Trump, but can you see the irony in equating threats of violence from the person mandated to run the organisation that holds the legal monopoly on violence with threats from regular citizens who don't, and on top of that calling me out on whataboutism?
> I fail to see how that makes it any more relevant than it was previously. As I said, TV news has a whole different set of regulations they have to adhere to. Internet news is almost entirely without regulation of any sort still.

TV shows are regularly uploaded to YouTube by the stations themselves. I don't see how that's irrelevant.
> The source is whoever/whatever gave the news outlet information enough to base a story on. Not the person who posted it.

YouTube and Facebook are not sources; the people posting on these platforms are the source. I understand that algorithms are responsible for serving up recommendations and that accelerates the spreading of fake news. This, however, would be a discussion about how recommendations should be tweaked.
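For what it's worth, "tweaking recommendations" could be as simple as demoting flagged videos instead of deleting them. A minimal sketch, with the caveat that every name here (`rerank`, `engagement`, `flagged`, the penalty value) is invented for illustration and has nothing to do with YouTube's actual system:

```python
# Hypothetical sketch: demote (rather than remove) videos that have been
# flagged by fact-checkers when ranking recommendations.

def rerank(candidates, penalty=0.5):
    """Sort candidates by engagement score, scaled down when flagged."""
    def score(video):
        base = video["engagement"]   # e.g. a watch-time-based score
        if video.get("flagged"):     # flagged as likely misinformation
            base *= penalty          # demote instead of removing outright
        return base
    return sorted(candidates, key=score, reverse=True)

videos = [
    {"id": "hoax", "engagement": 0.9, "flagged": True},
    {"id": "news", "engagement": 0.7, "flagged": False},
]
print([v["id"] for v in rerank(videos)])  # ['news', 'hoax']
```

The point of a demotion approach is that nothing gets censored outright; flagged content just stops being amplified.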
> The thing is that they haven't announced any plans to target 'political' videos. It's limited in scope to conspiracy theories and hoaxes, because those have been some of the most damaging videos lately. They're also the easiest videos to disprove with maybe five minutes of research.

I agree with the premise of such an implementation. But as I've outlined, political issues aren't as black and white as a substance never being found in pills. It's fair to have a discussion about the extent to which these algorithms should be applied and how they determine something as factual. Obviously, in the case of the Covington kids, fake news was spread on TV (which was in turn uploaded to YouTube, even days after it was debunked), in papers, and on different news sites. How would a machine learn that information spread by the biggest news outlets in the world is non-factual? If it's humans applying these warnings manually, how do you ensure they're correct? You'd essentially have to hold them to a higher standard than journalists. Crowdsource it? Wikipedia has conflicting information! Again, I'm not trying to call out hypocrisy, I'm trying to discuss challenges.
What concerns me about Youtube, perhaps even more than the spread of disinformation itself, is the algorithms by which people are recommended these disinformation videos/videos by hate groups. Kids shouldn't be getting recommended neo-nazi vids after watching PewDiePie play Banjo-Kazooie or some shit.
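The complaint above is about topic crossover in recommendations. A toy illustration of the gating people are asking for: a "related videos" step that never crosses from one topic into a blocklisted one. The topic labels and candidate pool are made up for this example; a real system works on learned embeddings, not hand-written labels:

```python
# Toy gate: suggest only same-topic candidates, excluding gated topics,
# so a gaming video can never lead into extremist content.

GATED_TOPICS = {"extremist_politics", "conspiracy"}

def related_videos(watched_topic, candidates):
    """Return IDs of same-topic candidates, never from gated topics."""
    return [c["id"] for c in candidates
            if c["topic"] == watched_topic and c["topic"] not in GATED_TOPICS]

pool = [
    {"id": "banjo-speedrun", "topic": "gaming"},
    {"id": "letsplay", "topic": "gaming"},
    {"id": "recruitment-vid", "topic": "extremist_politics"},
]
print(related_videos("gaming", pool))  # ['banjo-speedrun', 'letsplay']
```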
> Well, there was the time he threatened nuclear war with Iran on Twitter, the multiple times he threatened investigations/jail against his political opponents, and the time he posted a wrestling gif of him hitting CNN in the back of the head with a chair. So everything up to and including threats of physical violence. I can post each of these individually if you'd like, but they're pretty well-known Tweets by now.

Like I said, I'm unaware of the threats; doing a quick Google search, all I could find was threats of military action, which is the irony I was getting at. I'd appreciate some examples if you have any at hand.
> Which is not at all related to fact-checking conspiracy and hoax videos. TV news issues its own retractions when they get their info wrong.

We're discussing algorithms on YouTube. These shows are posted to YouTube and would be subject to the algorithm; if you can't make that connection, then we're just wasting time.
> They use Facebook and YouTube as sources for news gathering. I was never suggesting that Facebook and YouTube publish their own news.

You're correct. I hope you see now how idiotic it was to say people use YouTube and Facebook as sources.
> Correct. It also says the feature will be rolled out worldwide at a later date.

From what I understand, a conflict between Pakistanis and Indians is a political issue, and your article says the algorithm will be rolled out to warn of misinformation there. It's literally in the article you posted.
> I didn't comment on which side of the political spectrum has more videos available on Youtube, just on how the recommendation algorithm is broken when it's recommending extremist political videos (of any kind) after watching a gaming video.

According to this http://pyt.azureedge.net/ from this software engineer https://twitter.com/mark_ledwich, left-wing politics dominates YouTube and receives many more recommendations. I'd be more worried about kids being indoctrinated with communist propaganda than by Nazis. (Hint: this is whataboutism, but it really does no one a favor to look at only one side of the issue.)
> Some quotes were wrongly attributed to the boys, but for the most part all TV news did was run the recorded footage.

Pretty much all MSM reporting on the Covington kids case was a hoax; no retractions have been issued that I'm aware of.
> I would strongly disagree. Youtube and Facebook have no method of filtering out tabloid garbage, and they'll often give it top billing just as much as they do with real news. Plus you're at the mercy of your friend groups and liked videos in terms of what will be presented to you. At best you'll get 1/4th of the complete picture when using either of these sites for news gathering.

I guess we have a misunderstanding. The point is that there's nothing inherently wrong with YouTube or Facebook as a source for news gathering.
> Huh? I don't remember agreeing to that. The focus is on conspiracy and hoax videos entirely until they announce otherwise. And while some of those conspiracy videos might be somewhat political in nature, they're still the easiest videos to disprove.

So now that we've agreed it will be applied to political issues, the point still stands that things aren't as black and white in that area. I will wait and see what the implementation looks like and the extent of its application.
> The problem is how Youtube chooses to curate its content, or rather its complete lack of curation. You can have two videos listed right next to each other, one fake news and one real news, but without doing some research outside of Youtube, there's no way to tell them apart in terms of credibility. Your average user definitely isn't going to bother.

TBH I can't really speak for Facebook, as I don't use it, but with regard to YouTube there's literally no difference between watching a clip on TV and watching the same clip on YouTube. In the end it comes down to whose content you watch. I believe we're really arguing about the temptation of recommended videos from unreliable outlets/content creators?
> I was agreeing to the localized rollout, not that the Pakistani-Indian issue was a political one. That was an issue of false pedophilia/kidnapping allegations which led to lynchings. Not very political at all in its nature.

I guess you should pay attention to what you quote and how you respond.
> I wish it was only my country, but the truth is that a majority of the world is used to instant gratification now thanks to social media. Thus it's not only the US that's open to floods of disinformation. For example, if anybody here believes that the Brexit campaign/vote was entirely organic and not influenced by foreign countries, I've got some oceanfront property in Colorado to sell you.

Hmm, you scare the shit out of me.
If an average user will not bother to check whether a story is factual, you have much bigger problems in your country than social media algorithms.
> Chances are incredibly high that his post was just another neo-Nazi dog whistle from a neo-Nazi member.

Wait, who's organizing to resist Google? They haven't always made the best decisions, but I'd still prefer if they were in charge of government right now as opposed to who we actually have.
> I know you are trying to be sneaky with your dog whistles, but you aren't very good at it.

The same thing and "people" responsible for everything we are currently organizing to resist. Okay!
> Oh, appreciate the heads up. In that case, he can try his hardest, but we all know what the number one political ideology is, now and forever:

Chances are incredibly high that his post was just another neo-Nazi dog whistle from a neo-Nazi member.
> I find your constant decrying of the evils of censorship then being followed by such comments to be very odd.

Many videos on YouTube just rub me the wrong way. I wish they were permanently erased from their servers.
> They got a cool 250mil from a settlement from WP. One of the editors of CNN violated the confidentiality agreement and now they're going to sue them again. lol

The Covington case was not a small issue. They were being harassed because of the fake news, and they had to shut down their entire school for their safety because there were threats of violence. It was not a small issue at all.
If a person says go beat up this person on a video then yes shut them down.
But I don’t want them to shut down Hoaxers. People should be free to talk about flat earth all they want.
They are going to police information as if they are the authority. They are essentially giving the top spot to whoever they think is better. But Politifact isn't always right, nor is Quint (the example used in the article). The way the fact checker is set up, it can perpetuate fake news rather than solve the issue. And they are shutting down and deplatforming alternative news and their competition.
Imagine this system with the covington case. YouTube actually got this right, and the alternative news is how people learned the truth.
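The "top spot" concern above can be made concrete with a rough sketch: when a query matches a flagged topic, a designated fact-checker's panel gets pinned above all organic results. `PANELS`, the query strings, and the source name are all invented here; the point is just that whoever fills that table controls rank one, right or wrong:

```python
# Rough sketch of a pinned fact-check panel: entries in PANELS always
# outrank organic search results for matching queries.

PANELS = {
    "flat earth": {"source": "ExampleFactCheck", "note": "debunked claim"},
}

def search_page(query, results):
    """Return the result list, with a fact-check panel pinned first."""
    page = []
    panel = PANELS.get(query.lower())
    if panel:
        page.append({"type": "panel", **panel})  # always ranked first
    page.extend({"type": "result", "id": r} for r in results)
    return page

page = search_page("flat earth", ["vid1", "vid2"])
print([entry["type"] for entry in page])  # ['panel', 'result', 'result']
```

If the panel's source gets a story wrong (as in the Covington case), the error sits at rank one until someone edits the table, which is exactly the failure mode being argued about.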
> Wow old post you replied to. Good, that's what WP gets for knowingly putting out false information to mislead and push their agenda.

They got a cool 250mil from a settlement from WP. One of the editors of CNN violated the confidentiality agreement and now they're going to sue them again. lol