Friday, October 29th 2021

Introducing Meta: A Social Technology Company

Today at Connect 2021, CEO Mark Zuckerberg introduced Meta, which brings together Facebook's apps and technologies under one new company brand. Meta's focus will be to bring the metaverse to life and help people connect, find communities and grow businesses.

The metaverse will feel like a hybrid of today's online social experiences, sometimes expanded into three dimensions or projected into the physical world. It will let you share immersive experiences with other people even when you can't be together — and do things together you couldn't do in the physical world. It's the next evolution in a long line of social technologies, and it's ushering in a new chapter for the company. Zuckerberg shared more about this vision in a founder's letter.
The annual Connect conference brings together augmented and virtual reality developers, content creators, marketers and others to celebrate the industry's momentum and growth. This year's virtual event explored what experiences in the metaverse could feel like over the next decade, from social connection to entertainment, gaming, fitness, work, education and commerce. The company also announced new tools to help people build for the metaverse, including Presence Platform, which will enable new mixed reality experiences on Quest 2, and a $150-million investment in immersive learning to train the next generation of creators.

You can watch the full Connect keynote and learn more about how the metaverse will unlock new opportunities at meta.com. You can also learn more about the company's work over the past several months to develop the Meta brand on its design blog.

The company's corporate structure is not changing as part of the switch to Meta; however, how it reports its financials will. Starting with results for the fourth quarter of 2021, the company plans to report on two operating segments: Family of Apps and Reality Labs. Meta also intends to start trading under MVRS, the new stock ticker it has reserved, on December 1. Today's announcement does not affect how the company uses or shares data.

About Meta
Meta builds technologies that help people connect, find communities, and grow businesses. When Facebook launched in 2004, it changed the way people connect. Apps like Messenger, Instagram and WhatsApp further empowered billions around the world. Now, Meta is moving beyond 2D screens toward immersive experiences like augmented and virtual reality to help build the next evolution in social technology.
Source: Meta

108 Comments on Introducing Meta: A Social Technology Company

#76
windwhirl
Ravenas: Throwing out algorithms, I really don't understand the difference between Facebook and this forum. I don't mean that in a negative way. Topics are posted to Techpowerup, just like they are on Facebook. People comment. It's a people problem, not a platform problem. Society has degraded to become nearly 100% uncensored and unfiltered. Misinformation isn't specific to Facebook, it's everywhere.

Scale Techpowerup to billions of active users worldwide, and there is no difference.
For one, there are active moderators at a scale more or less adequate to the forum's activity.
Posted on Reply
#77
Ravenas
Valantar: Sure there is. Without algorithms to sort and promote content, things would fall to the wayside very, very quickly. The reinforcement loop effects we see with most large scale algorithmic social media would be far, far weaker, simply because most content would just disappear before reaching most people.

You're right inasmuch as this can't solely be blamed on algorithms, but I don't think anyone here is doing that. What is being said is that due to the algorithms in place working as they do, being designed in the way they are, this serves to afford a specific set of interactions and behaviours. If people didn't in some way like these behaviours, this would likely not be a problem. But similarly, if these behaviours were not being afforded, or even promoted, by the algorithms, then people would most likely not take part in them to nearly the same degree.

I mean, the field of media effects and media reception has been studied for the better part of a century, and the main finding from this is that there is no such thing as a simple answer. Rather, how people relate to each other through social media, and how they relate to social media itself, is intrinsically bound up in the dynamics of these interactions, the affordances of the systems, the relations in play, the specific forms of mediation involved and how they affect how we view and relate to other people, and a whole host of other factors. Algorithms are just one piece of this puzzle. But it's a major piece, one that ties together many if not most of the others, one that has a strong deterministic effect on the specific actions encouraged and afforded by the others, and crucially, is for the most part a black box, created with goals that are not necessarily aligned with (and potentially in direct conflict with) the goals of users. A large-scale system of communication that systemically privileges emotionally laden communication will over time encourage more emotionally laden posting and more emotional responses, as that is the behaviour afforded by the system, and other behaviours are similarly discouraged by the design of the system. This isn't a 1:1 relation in any way, but there is a clear direction to it still.
I've been a Facebook member since 2007 (and I'm not defending Facebook by any means), and I've never engaged in the problems highlighted with Facebook, despite a decade of algorithm changes.

It's a people problem. Society has degraded. Social norms and boundaries are collapsing, from the top down: from leadership in government to people on Facebook.
windwhirl: For one, there are active moderators at a scale more or less adequate to the forum's activity.
Rule breaking exists despite moderation.
Posted on Reply
#78
bug
Ravenas: Throwing out algorithms, I really don't understand the difference between Facebook and this forum. I don't mean that in a negative way. Topics are posted to Techpowerup, just like they are on Facebook. People comment. It's a people problem, not a platform problem. Society has degraded to become nearly 100% uncensored and unfiltered. Misinformation isn't specific to Facebook, it's everywhere.

Scale Techpowerup to billions of active users worldwide, and there is no difference.
The difference (with or without algorithms) is the automated sharing. On TPU I may get a heads-up about some user I'm following, but nothing's going to fill my forum page with kittens and whatnot.
I.e. social media actively pushes content, while actively maintaining it's not responsible for the content it pushes. That was fine in the early days, but today, with countless instances of damage enabled by social media, it cannot be allowed to continue without tighter regulation.
Posted on Reply
#79
Ravenas
bug: The difference (with or without algorithms) is the automated sharing. On TPU I may get a heads-up about some user I'm following, but nothing's going to fill my forum page with kittens and whatnot.
I.e. social media actively pushes content, while actively maintaining it's not responsible for the content it pushes. That was fine in the early days, but today, with countless instances of damage enabled by social media, it cannot be allowed to continue without tighter regulation.
People have a choice whether to respond to a post they are shown. No one forces you to type a comment. They make money off advertising driven by engagement probability.

The only root problem I have with them is data mining of their users.

For example, TechPowerUp uses Google Analytics to track website activity. This is extremely minimal in comparison to a site like TomsHardware, which uses the 49 known trackers listed below (a rough sketch of how such third-party hosts can be enumerated follows the list).

ssc-cms.33across.com
eb2.3lift.com
s2.adform.net
sync.adkernel.com
ib.adnxs.com
cdn.adsafeprotected.com
match.adsrvr.org
pixel.advertising.com
c.amazon-adsystem.com
s.amazon-adsystem.com
sync.bfmio.com
x.bidswitch.net
stags.bluekai.com
bttrack.com
ssum-sec.casalemedia.com
bh.contextweb.com
p.cpx.to
cdn.districtm.io
uk-script.dotmetrics.net
purch-match.dotomi.com
ad.doubleclick.net
securepubads.g.doubleclick.net
cs.emxdgt.com
ps.eyeota.net
www.facebook.com
www.google-analytics.com
adservice.google.com
g2.gumgum.com
b-code.liadm.com
ap.lijit.com
cookie-matching.mediarithmics.com
ml314.com
cdn.onesignal.com
us-u.openx.net
cdn.parsely.com
6093eccf-6734-4877-ac8b-83d6d0e27b46.edge.permutive.app
ads.pubmatic.com
id.sv.rkdms.com
api.rlcdn.com
ats.rlcdn.com
secure-assets.rubiconproject.com
sb.scorecardresearch.com
ads.servebom.com
offer.slgnt.eu
purch-sync.go.sonobi.com
sync.go.sonobi.com
qds0l.publishers.tremorhub.com
ups.analytics.yahoo.com
content.zeotap.com
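
As a purely illustrative aside, a list like the one above can be gathered from a browser's developer console using the standard Performance API. The sketch below is hypothetical and is not the tool behind the "49 known trackers" count (that figure appears to come from a browser tracking-protection panel, which matches requests against a curated blocklist); the "third-party" test here is just a naive hostname comparison.

```ts
// Hypothetical sketch: list third-party hostnames the current page has
// loaded resources from. Paste into the devtools console of the page
// you want to inspect.
const pageHost = location.hostname;
const thirdPartyHosts = new Set<string>();

for (const entry of performance.getEntriesByType("resource")) {
  // Each resource entry's "name" is the full URL that was fetched.
  const host = new URL(entry.name).hostname;
  // Naive third-party test: anything not under the page's exact hostname
  // counts, so same-site CDNs may be over-counted and cloaked trackers missed.
  if (host !== pageHost && !host.endsWith("." + pageHost)) {
    thirdPartyHosts.add(host);
  }
}

console.log(`${thirdPartyHosts.size} third-party hosts:`, [...thirdPartyHosts].sort());
```
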
Beyond that data mining, you are asking Facebook to fix society's problems, which is impossible. They existed before and will exist after. No amount of government regulation will solve that unless we go Chinese communist, shut down all social media platforms and then further stop any and all dissent in society.
Posted on Reply
#80
AsRock
TPU addict
TheLostSwede: I'm not saying it's aliens, but it's aliens...

Dude, the infinity symbol has been around for ever, they just twisted it a little haha.
Posted on Reply
#81
bug
Ravenas: People have a choice whether to respond to a post they are shown. No one forces you to type a comment. They make money off advertising driven by engagement probability.
That's assuming the target audience is responsible adults. Which we all know it's not.
Ravenas: Beyond that data mining, you are asking Facebook to fix society's problems, which is impossible. They existed before and will exist after. No amount of government regulation will solve that unless we go Chinese communist, shut down all social media platforms and then further stop any and all dissent in society.
I know, I have already stated it's a tough nut to crack - it's every bit as hard as gun control. But they can certainly do better than they do.
Posted on Reply
#82
Valantar
bug: That's assuming the target audience is responsible adults. Which we all know it's not.
No, it just assumes that humans are, or can be, (mainly/only) rational actors, which is perhaps the single most damaging fallacy promoted by Western enlightenment thinking. (Though the mind-body split is also up there, for sure.)
Posted on Reply
#83
Why_Me
bug: That's assuming the target audience is responsible adults. Which we all know it's not.

I know, I have already stated it's a tough nut to crack - it's every bit as hard as gun control. But they can certainly do better than they do.
As far as the US goes, I'd be more worried about the lies and propaganda the MSN pushes than Facebook algorithms. With gun control, it's the cities with the most stringent gun laws that have the most shootings.
Posted on Reply
#84
Valantar
Why_Me: As far as the US goes, I'd be more worried about the lies and propaganda the MSN pushes than Facebook algorithms. With gun control, it's the cities with the most stringent gun laws that have the most shootings.
Not really relevant to the topic, but have you considered that the causality might be the other way around? I.e. that they have the strictest gun laws in order to try and reduce an already high level of gun violence? I'm not saying there's a causal relation either way, but you clearly are without backing that up with anything more than a loaded suggestion.
Posted on Reply
#85
RandallFlagg
Valantar: No, it just assumes that humans are, or can be, (mainly/only) rational actors, which is perhaps the single most damaging fallacy promoted by Western enlightenment thinking. (Though the mind-body split is also up there, for sure.)
Individuals have the ability to turn off things they do not want to see and encourage things they do want to see. One person may choose to turn off CNN, another to turn off Fox News. If you run around looking for Russian conspiracies then you'll see that, if you look for Q-Anon then it'll show you more. If you want to know how useless masks are you'll see that, if you want to know how great they are that is what you'll see.

What this really amounts to is the rather obvious - people want to control what others see and read in order to foist their agendas on others. Since social media is a 'push' model unlike say a library which is clearly a 'pull' model, it's an easy target. You are not getting educated on social media, you are being indoctrinated. This is where most people will have a cow, because they think they've learned so much on Twitter and Facebook.

Personally I think social media platforms should be redefined and greatly shrunk by removing their section 230 protection entirely.
Posted on Reply
#86
Why_Me
Valantar: Not really relevant to the topic, but have you considered that the causality might be the other way around? I.e. that they have the strictest gun laws in order to try and reduce an already high level of gun violence? I'm not saying there's a causal relation either way, but you clearly are without backing that up with anything more than a loaded suggestion.
What loaded suggestion? St. Louis, Chicago, DC, Detroit, and Baltimore just to name a few have the strictest gun laws in the US and also the most shootings per capita.
Posted on Reply
#87
Valantar
Why_Me: What loaded suggestion? St. Louis, Chicago, DC, Detroit, and Baltimore just to name a few have the strictest gun laws in the US and also the most shootings per capita.
That might be (I frankly have no idea about the facts here), but why is it being brought up in this context? What are you pointing at by saying this in this discussion? It certainly seems like you're implying a causal relationship, but if so that just warrants a reminder that correlation does not equal causation. If not, then I don't understand your response, or why you brought it up at all (though tbh I struggle with the latter regardless of the intent, as it's getting very, very OT at this point).
Posted on Reply
#88
Why_Me
Valantar: That might be (I frankly have no idea about the facts here), but why is it being brought up in this context? What are you pointing at by saying this in this discussion? It certainly seems like you're implying a causal relationship, but if so that just warrants a reminder that correlation does not equal causation. If not, then I don't understand your response, or why you brought it up at all (though tbh I struggle with the latter regardless of the intent, as it's getting very, very OT at this point).
www.techpowerup.com/forums/threads/introducing-meta-a-social-technology-company.288435/post-4640454

Reading the post I quoted and then your reply afterwards isn't difficult.
Posted on Reply
#89
TheLostSwede
News Editor
AsRock: Dude, the infinity symbol has been around for ever, they just twisted it a little haha.
It just happened to coincide with a show on Netflix that used that design for their alien antagonist, so just a bit of fun.
Posted on Reply
#90
Valantar
Why_Me: www.techpowerup.com/forums/threads/introducing-meta-a-social-technology-company.288435/post-4640454

Reading the post I quoted and then your reply afterwards isn't difficult.
Hm? I don't see your point. You said "I'd be more concerned about propaganda", and then brought up a more or less random point about a seeming correlation between gun violence and legislation. This strongly implies that you think this relates to said "propaganda", but leaves it at the level of a loaded suggestion as you neither argue for it nor present any data or analysis supporting there being an actual causal relation. Hence me asking about the directionality of this relation and whether there can be said to be any causal link at all. Is this difficult to grasp?

The post you quoted just said "this is a difficult question, just like gun control". You took that statement and... went to "the places with the strictest gun control laws have the most gun violence", as if that statement somehow addresses the complexities of gun control.
RandallFlagg: Individuals have the ability to turn off things they do not want to see and encourage things they do want to see. One person may choose to turn off CNN, another to turn off Fox News. If you run around looking for Russian conspiracies then you'll see that, if you look for Q-Anon then it'll show you more. If you want to know how useless masks are you'll see that, if you want to know how great they are that is what you'll see.
Again, this is a bit too simplistic. It's not wrong, but using the word "choose" without any further reservations makes it sound as if this is an entirely voluntary and rational process, which it isn't - and crucially, can never realistically be. Relating consciously and rationally to everything in your life is simply not possible - even attempting such a thing is a recipe for exhaustion and burnout. Heck, there is plenty of research on the severe detrimental effects of too much information and having too many choices to make. Our brains don't work that way - relying on habit, automated responses, or just not thinking things through all that much is a survival mechanism. This of course has several issues, from those habits and automated responses being flawed in various ways (in no small part thanks to habits often forming from these non-rational processes), to opening one up to manipulation.

The issue arises when so much of our lives are mediated through essentially free-for-all platforms, where anyone with the right degree of savvy and lack of scruples have the means to spread manipulative lies to millions of people. And the issue is exacerbated when these platforms are built around "engagement", and see all engagement as good engagement, meaning that anything shocking or provocative gets promoted and shared more, gets featured more prominently in people's feeds, etc. And on top of that there are the algorithms for feeding you new things, which feed off previous engagement, which always leads to escalation in degree and form. There's plenty of excellent reporting done on the shockingly short path from watching relatively moderate conservative videos on YouTube to being served bonkers Pizzagate-style conspiracy theories - and Facebook does the same, just slightly more subtly and more through paid ads.
RandallFlagg: What this really amounts to is the rather obvious - people want to control what others see and read in order to foist their agendas on others. Since social media is a 'push' model unlike say a library which is clearly a 'pull' model, it's an easy target. You are not getting educated on social media, you are being indoctrinated. This is where most people will have a cow, because they think they've learned so much on Twitter and Facebook.
This on the other hand I completely agree with. Social media gained popularity through ease of use and new modes of communication, but brought with them a problematic blurring of the lines between social and parasocial relationships as well as various never before seen modes of one-to-many multidirectional communication (or: shouting into the void, hoping for a response). On top of that, as these companies were hemorrhaging money (and the creators likely took a lot of voyeuristic pleasure in their access to various data), they started mixing ads and promotions into this - which quickly became a really, really problematic mixture. Rather than realizing the initial utopian dream of free and open communication we instead have a system dominated by (hidden) money and power, just in slightly new forms (often modelled after tax evasion and other neoliberal schemes, with shell accounts and astroturfing), but with all the more reach and impact as people haven't learned (or have forgotten) how to differentiate between trustworthy sources, actual social relationships, and all the cruft.

Btw, a really interesting article I read the other day by one of the more interesting media researchers in recent decades on this exact thing: People aren't meant to talk this much.
Posted on Reply
#91
Ravenas
Valantar: Again, this is a bit too simplistic. It's not wrong, but using the word "choose" without any further reservations makes it sound as if this is an entirely voluntary and rational process, which it isn't - and crucially, can never realistically be. Relating consciously and rationally to everything in your life is simply not possible - even attempting such a thing is a recipe for exhaustion and burnout. Heck, there is plenty of research on the severe detrimental effects of too much information and having too many choices to make. Our brains don't work that way - relying on habit, automated responses, or just not thinking things through all that much is a survival mechanism. This of course has several issues, from those habits and automated responses being flawed in various ways (in no small part thanks to habits often forming from these non-rational processes), to opening one up to manipulation.
Whether you think people have the ability to make choices or not, it is wrong to not give a choice. Facebook contains no addictive compounds which force the brain to do anything. Your implication that people can't make choices is just wrong.

For example, you made a choice to comment repeatedly on this topic. No one forced you to, but you did. You made a choice. You could have also made the choice not to. There is no difference between a forum like TechPowerUp and Facebook in regards to content and making choices. Facebook may have more advanced algorithms and larger userbase to target people based on the data they have collected, but that is only to provide better service for advertisers and to also group large quantities of data for certain audiences who may find it news worthy.

Regulation is not the answer. Better society and better leadership is. Politicians and society blame Facebook for their own problems.
Posted on Reply
#92
Valantar
Ravenas: Whether you think people have the ability to make choices or not, it is wrong to not give a choice. Facebook contains no addictive compounds which force the brain to do anything. Your implication that people can't make choices is just wrong.

For example, you made a choice to comment repeatedly on this topic. No one forced you to, but you did. You made a choice. You could have also made the choice not to. There is no difference between a forum like TechPowerUp and Facebook in regards to content and making choices. Facebook may have more advanced algorithms and larger userbase to target people based on the data they have collected, but that is only to provide better service for advertisers and to also group large quantities of data for certain audiences who may find it news worthy.

Regulation is not the answer. Better society and better leadership is. Politicians and society blame Facebook for their own problems.
...sigh. Please at least make an attempt at reading carefully? I never said anything even remotely close to an "implication that people can't make choices". I said that the choices people make are never purely rational, often not even conscious, and live in an extremely complex and muddled area between choice and habit, and that thus using the word "choose" without any reservations brings with it an unsustainable assumption of conscious rationality.

Also: post-hoc framing is not the same as situated reality. Yes, I made a choice to comment on this topic. Does that mean that I necessarily at that point thought "I could not, but I'm choosing to respond to this"? No. I might have, or I might not. I might have felt compelled by something said by someone else in an intellectual way, an emotional way, or (far more likely) some mixture of both. There are many, many ways of framing the re-telling of such events after the fact, each of which will have different implications for how the events are understood and interpreted. Again, as the word "choice" has strong ties to the idea of the rational enlightened subject making conscious actions towards the world, describing events this way carries with it a mode of interpretation that is quite problematic and certainly cannot be said to be universally applicable. Not everything that can be described as "a choice" can be reasonably understood as such in the context in which it happened without a lot of added context.

Also, you're entirely ignoring the structural power of determining which choices are available in the first place, something that social media algorithms have almost absolute power over. And this is where the complex tie-ins with other fundamental human behaviours and needs come into play - if you want to keep up with your family and friends (for example), you also necessarily open yourself up to being affected by what these algorithms choose to serve you. There are many, many, many different ways of resisting this, but (again, crucially), such resistance necessitates a conscious effort, and is incredibly draining over time. Sustaining such an effort is nearly impossible in the context of a normal human life - there are typically more pressing things to spend your mental energy on (or you just might not be sufficiently aware of the myriad ways in which you are being pushed and pulled, and thus not have the vocabulary to formulate strategies of resistance).

I'm not necessarily saying that regulation is the answer - I don't have any suggestions as to how to effectively regulate social media without that effectively becoming censorship. But it's pretty clear that the current predatory capitalist free-for-all is deeply harmful on many levels. Whether this is because it has exacerbated already existing tendencies and developments in culture and society, or whether it has in some form caused those developments itself? That's a kind of black-and-white, nuance and detail erasing chicken and egg question that is at best a distraction, as it asks us to question root causes and only try to fix those when we can identify them (which we will never be able to do, as there are as many root causes as there are people using these media, if not more), rather than try to counteract the well documented harmful effects of the current mechanisms of these media.
Posted on Reply
#93
Vayra86
Ravenas: Throwing out algorithms, I really don't understand the difference between Facebook and this forum. I don't mean that in a negative way. Topics are posted to Techpowerup, just like they are on Facebook. People comment. It's a people problem, not a platform problem. Society has degraded to become nearly 100% uncensored and unfiltered. Misinformation isn't specific to Facebook, it's everywhere.

Scale Techpowerup to billions of active users worldwide, and there is no difference.
Read up on the papers that have been released on Facebook and their business model. Algorithms direct which posts are promoted and made more visible, more prominent and more likely to appear on users' timelines. The 'weight' of a like is subject to manipulation, depends on what kind of post it's under, and partially directs how the algorithm ranks visibility. On TPU, there is a 'fixed system' where the date/time of the latest post makes the topic appear at the top of a subsection. You, the user, are in control of what appears as prominent, and the way this data is being used is transparent. We can see what's happening.

Facebook is a black box, and that box has been opened, and what we can see is mind-blowing: free from all ethics and built only for personal (Zuckerberg) gain at the expense of everything and everyone else. Literally. Aggression and failure are promoted because they generate clicks. Positive posts are made less prominent because they don't generate quite the same ad revenue.

It could not be more different. We could spend a day worldwide filling Facebook with love and flowers, and we'd still get served more misery on top of timelines.

In similar ways, many algorithms have been shown to promote tunnel vision: they create a self-reinforcing and self-confirming effect, and with it a distorted window on reality.
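
To make the contrast described in this post concrete, here is a minimal, purely illustrative sketch of the two ordering approaches. The types, field names and reaction weights are invented for the example and are not taken from Facebook's or TPU's actual code.

```ts
interface Post {
  id: string;
  lastReplyAt: number;                 // unix timestamp of the latest reply
  reactions: Record<string, number>;   // e.g. { like: 12, angry: 3 }
}

// Forum-style "fixed system": whatever was replied to most recently floats
// to the top, so users themselves determine what is prominent.
function forumOrder(posts: Post[]): Post[] {
  return [...posts].sort((a, b) => b.lastReplyAt - a.lastReplyAt);
}

// Engagement-weighted ordering: each reaction type carries an opaque,
// platform-chosen weight, so tuning the weights changes what is prominent
// without users ever seeing why.
const REACTION_WEIGHTS: Record<string, number> = { like: 1, love: 2, angry: 5 };

function engagementOrder(posts: Post[]): Post[] {
  const score = (p: Post) =>
    Object.entries(p.reactions).reduce(
      (sum, [kind, count]) => sum + (REACTION_WEIGHTS[kind] ?? 1) * count,
      0,
    );
  return [...posts].sort((a, b) => score(b) - score(a));
}
```
Same posts, two very different timelines - and only the first ordering is fully inspectable by the user.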
Posted on Reply
#94
Ravenas
Vayra86: Read up on the papers that have been released on Facebook and their business model. Algorithms direct which posts are promoted and made more visible, more prominent and more likely to appear on users' timelines. The 'weight' of a like is subject to manipulation, depends on what kind of post it's under, and partially directs how the algorithm ranks visibility. On TPU, there is a 'fixed system' where the date/time of the latest post makes the topic appear at the top of a subsection. You, the user, are in control of what appears as prominent, and the way this data is being used is transparent. We can see what's happening.

Facebook is a black box, and that box has been opened, and what we can see is mind-blowing: free from all ethics and built only for personal (Zuckerberg) gain at the expense of everything and everyone else. Literally. Aggression and failure are promoted because they generate clicks. Positive posts are made less prominent because they don't generate quite the same ad revenue.

It could not be more different. We could spend a day worldwide filling Facebook with love and flowers, and we'd still get served more misery.
I've already stated at the beginning that, algorithms aside, there is no difference. Add in algorithms, and they only give you more prominent posting of things you were already shown to be interested in.

Prominent posts aren't generating revenue, advertisement you see on Facebook is generating revenue and selling your data. The only remotely shady aspect is that your data is being mined and sold, and ads are being targeted based off your data.

Prominent posts causing discussion only create reactions, not revenue.
Posted on Reply
#95
Vayra86
Ravenas: I've already stated at the beginning that, algorithms aside, there is no difference. Add in algorithms, and they only give you more prominent posting of things you were already shown to be interested in.

Prominent posts aren't generating revenue, advertisement you see on Facebook is generating revenue and selling your data. The only remotely shady aspect is that your data is being mined and sold, and ads are being targeted based off your data.

Prominent posts causing discussion only create reactions, not revenue.
And I'm saying your guess is wrong, read the papers.
Posted on Reply
#96
Ravenas
Vayra86: And I'm saying your guess is wrong, read the papers.
You’ve bought in to a narrative to draw your conclusion.

Your argument is no different from saying we should ban CNN, Fox News, MSNBC, and other news outlets for promoting the stories that generate more attention, in a negative manner towards one side or the other.
Posted on Reply
#97
bug
Ravenas: I've already stated at the beginning that, algorithms aside, there is no difference. Add in algorithms, and they only give you more prominent posting of things you were already shown to be interested in.

Prominent posts aren't generating revenue, advertisement you see on Facebook is generating revenue and selling your data. The only remotely shady aspect is that your data is being mined and sold, and ads are being targeted based off your data.

Prominent posts causing discussion only create reactions, not revenue.
This is where you're wrong. Without the prominent posts, there would be no page to display the ads on; prominent posts do generate revenue, albeit somewhat indirectly.

Imagine that instead of Facebook we're talking about UPS. If UPS made a business model of delivering whatever people want, would it be OK if they delivered drugs, guns and explosives, shifting all responsibility to the customer?
Posted on Reply
#98
Ravenas
bug: This is where you're wrong. Without the prominent posts, there would be no page to display the ads on; prominent posts do generate revenue, albeit somewhat indirectly.

Imagine that instead of Facebook we're talking about UPS. If UPS made a business model of delivering whatever people want, would it be OK if they delivered drugs, guns and explosives, shifting all responsibility to the customer?
Posts generate engagement.
User profiles and tracking of likes / dislikes etc. generate data.
Both in combination provide an outlet for targeted advertisement.

I’ve never stated otherwise.
Posted on Reply
#99
Vayra86
Ravenas: Posts generate engagement.
User profiles and tracking of likes / dislikes etc. generate data.
Both in combination provide an outlet for targeted advertisement.

I’ve never stated otherwise.
Right, but it matters how that data is being used. Is it used 'fairly' - as in, what you would or could reasonably expect given the popularity of certain subjects (a form of data transparency, 'being honest') - or is it being used in a manipulative way?

The papers reveal the manipulation at work is staggering and extends beyond Facebook as well, with full intent and knowledge that destabilization generates more money. Conflict generates money. We can see the results in societies. They're not positive. Facebook has lied on multiple counts about what they said they would do, and what they really did wrt changing the algorithms and prominence of posts.

The question that we need to ask here is whether we want that.
Posted on Reply
#100
bug
Vayra86: Right, but it matters how that data is being used. Is it used 'fairly' - as in, what you would or could reasonably expect given the popularity of certain subjects (a form of data transparency, 'being honest') - or is it being used in a manipulative way?

The papers reveal the manipulation at work is staggering and extends beyond Facebook as well, with full intent and knowledge that destabilization generates more money. Conflict generates money. We can see the results in societies. They're not positive. Facebook has lied on multiple counts about what they said they would do, and what they really did wrt changing the algorithms and prominence of posts.

The question that we need to ask here is whether we want that.
And we haven't said anything about organized groups spreading lies, yet.

Once again, I'm all for FB (even though my only engagement with them is WhatsApp). I understand they're in a delicate position - I've seen them being accused of closing legit accounts that were being wrongly reported. It's also unclear whether it's possible to sift through content generated by millions of people without hiring millions of people yourself. But at the same time, it's clear they are not overly concerned with curbing nefarious practices as long as they also rake in the cash. That's why I think some form of regulation is required.
Posted on Reply