Report: Threads Reaches 130 Million Monthly Active Users


Previous: Threads distributes smartphone wallpapers | GNU social JP Web
Interview: Meta product manager reveals ActivityPub support for general Threads users planned within 2-3 months at the earliest | GNU social JP Web

Meta CEO Mark Zuckerberg announced that Threads has reached 130 million monthly active users, so this article covers the news (Threads now reaches more than 130 million monthly users, says Meta, up 30M from Q3 | TechCrunch).

The figure was disclosed on 2024-02-01 in the Q4 2023 earnings announcement, "Meta – Q4 2023 Earnings Call". That is an increase of 10 million in three months over the 120 million reported at the end of Q3 in late September.

Post by @aiatmeta (2024-02-01T23:12:08.000Z):
Today, @zuck outlined Meta’s vision for building the world’s most advanced AI products and services — and how our teams are positioned to deliver on this work. You can read more in his post 👉 bit.ly/3SJ2m… With our investments in world-class compute infrastructure, our continued investment in open source and hundreds of millions of people interacting with AI services across our products, we have a unique opportunity to shape the future of AI.

Post by @mosseri (2024-02-01T23:36:13.000Z):
@zuck announced today at Meta’s earnings that Threads is growing steadily with more than 130M monthly actives. A big thank you to the team, but also to our growing community here – looking forward to improving on the experience for you all and joining more conversations.

I just shared Meta’s Q4 results. We estimate there are more than 3.1 billion people using at least one of our apps daily. I’m excited about the work we’re doing to accomplish our long-term vision for AI and the metaverse, while also making our apps and hardware really great. Here’s the transcript of what I said on our earnings call: This was a good quarter and it wrapped up an important year for our community and our company. We estimate that there are more than 3.1 billion people who use at least one of our apps each day. 2023 was our “year of efficiency” which focused on making Meta a stronger technology company and improving our business to give us the stability to deliver our ambitious long-term vision for AI and the metaverse. Last year, not only did we achieve our efficiency goals, but we returned to strong revenue growth, saw strong engagement across our apps, shipped a number of exciting new products like Threads, Ray Ban Meta smart glasses, and mixed reality in Quest 3, and of course established a world-class AI effort that is going to be the foundation for many of our future products. I think that being a leaner company is helping us execute better and faster, and we will continue to carry these values forward as a permanent part of how we operate. Now, moving forward, a major goal will be building the most popular and most advanced AI products and services. If we succeed, everyone who uses our services will have a world-class AI assistant to help get things done, every creator will have an AI that their community can engage with, every business will have an AI that their customers can interact with to buy goods and get support, and every developer will have a state-of-the-art open source model to build with. I also think everyone will want a new category of computing devices that let you frictionlessly interact with AIs that can see what you see and hear what you hear, like smart glasses. One thing that became clearer to me in the last year is that this next generation of services requires building full general intelligence. Previously I thought that because many of the tools were social, commerce, or maybe media-oriented that it might be possible to deliver these products by solving only a subset of AI’s challenges. But now it’s clear that we’re going to need our models to be able to reason, plan, code, remember, and many other cognitive abilities in order to provide the best versions of the services that we envision. We’ve been working on general intelligence research in FAIR for more than a decade, but now general intelligence will be the theme of our product work as well. Meta has a long history of building new technologies into our services and we have a clear long-term playbook for becoming leaders. There are a few key aspects of this that I want to take some time to go through today. The first is world-class compute infrastructure. I recently shared that by the end of this year we’ll have about 350k H100s and including other GPUs that’ll be around 600k H100 equivalents of compute. We’re well-positioned now because of the lessons that we learned from Reels. We initially under-built our GPU clusters for Reels, and when we were going through that I decided that we should build enough capacity to support both Reels and another Reels-sized AI service that we expected to emerge so we wouldn’t be in that situation again. And at the time the decision was somewhat controversial and we faced a lot of questions about capex spending, but I’m really glad that we did this. 
Going forward, we think that training and operating future models will be even more compute intensive. We don’t have a clear expectation for exactly how much this will be yet, but the trend has been that state-of-the-art large language models have been trained on roughly 10x the amount of compute each year. Our training clusters are only part of our overall infrastructure and the rest obviously isn’t growing as quickly. But overall, we’re playing to win here and I expect us to continue investing aggressively in this area. In order to build the most advanced clusters, we’re also designing novel data centers and designing our own custom silicon specialized for our workloads. The second part of our playbook is open source software infrastructure. Our long-standing strategy has been to build and open source general infrastructure while keeping our specific product implementations proprietary. In the case of AI, the general infrastructure includes our Llama models, including Llama 3 which is training now and is looking great so far, as well as industry-standard tools like PyTorch that we’ve developed. This approach to open source has unlocked a lot of innovation across the industry and it’s something that we believe in deeply. I know some people have questions about how we benefit from open sourcing the results of our research and large amounts of compute, so I thought it might be useful to lay out the strategic benefits here. The short version is that open sourcing improves our models, and because there’s still significant work to turn our models into products and because there will be other open source models available anyway, we find there are mostly advantages to being the open source leader and it doesn’t remove differentiation from our products much anyway. Now, more specifically, there are several strategic benefits. First, open source software is typically safer and more secure, as well as more compute efficient to operate due to all the ongoing feedback, scrutiny, and development from the community. This is a big deal because safety is one of the most important issues in AI. Efficiency improvements and lowering the compute costs also benefit everyone including us. Second, open source software often becomes an industry standard, and when companies standardize on building with our stack, that then becomes easier to integrate new innovations into our products. That’s subtle, but the ability to learn and improve quickly is a huge advantage and being an industry standard enables that. Third, open source is hugely popular with developers and researchers. We know that people want to work on open systems that will be widely adopted, so this helps us recruit the best people at Meta, which is a very big deal for leading in any new technology area. And again, we typically have unique data and build unique product integrations anyway, so providing infrastructure like Llama as open source doesn’t reduce our main advantages. This is why our long-standing strategy has been to open source general infrastructure and why I expect it to continue to be the right approach for us going forward. The next part of our playbook is just taking a long-term approach towards development. While we’re working on today’s products and models, we’re also working on the research that we need to advance for Llama 5, 6, and 7 in the coming years and beyond to develop full general intelligence. It’s important to have a portfolio of multi-year investments and research projects. 
But it’s also important to have clear launch vehicles like future Llama models that help focus our work. We’ve worked on general intelligence in our lab FAIR for more than a decade as I mentioned and we’ve produced a lot of valuable work. But having clear product targets for delivering general intelligence really focuses this work and helps us build the leading research program. The next key part of our playbook is learning from unique data and feedback loops in our products. When people think about data, they typically think about the corpus that you might use to train a model up front. On Facebook and Instagram there are hundreds of billions of publicly shared images and tens of billions of public videos, which we estimate is greater than the Common Crawl dataset and people share large numbers of public text posts in comments across our services as well. But even more important than the upfront training corpus is the ability to establish the right feedback loops with hundreds of millions of people interacting with AI services across our products. This feedback is a big part of how we’ve improved our AI systems so quickly with Reels and ads especially over the last couple of years when we had to rearchitect it around new rules. That brings me to the last part of our playbook for building leading services, which is our culture of rapid learning and experimentation across our apps. When we decide that a new technology like AI-recommended Reels is going to be an important part of the future, we’re not shy about having multiple teams experimenting with different versions across our apps until we get it right — and then we learn what works and roll it out to everyone. There used to be this meme that we’d probably launch Stories on our Settings page at some point. And look, I think it’s funny because it gets to a core part of our approach. We start by learning and tuning our products until they perform the way we want, and then we roll them out very broadly. Sometimes, occasionally, products blow up before we’re ready for them to, like Threads, although I’ll note that Threads now has more people actively using it than it did during its initial launch peak — so that one’s on track I think to be a major success. But normally, we learn and iterate methodically. We started doing that with our AI services in the fall, launching Meta AI, our assistant, AI Studio, which is the precursor to Creator AIs, our alpha with business AIs, and Ray-Ban Meta smart glasses. We’ve been tuning each of these and we’re getting closer to rolling them out widely. So you should expect that in the coming months. From there, we’ll focus on rolling out services until they reach hundreds of millions or billions of people. And usually only when we reach that kind of scale do we start focusing on what monetization will look like. Although in this case, the way business AIs will help business messaging grow in WhatsApp, Messenger, and Instagram is pretty clear. But that’s our basic approach, and I’m really excited about pointing our company at developing so many of these awesome things. Now we have two major parts of our long term vision, and in addition to AI, the other part is the metaverse. We’ve invested heavily in both AI and the metaverse for a long time, and will continue to do so. 
These days there are a lot of questions more about AI that I get, and that field is moving very quickly, but I still expect this next generation of AR, MR, and VR computing platforms to deliver a realistic sense of presence that will be the foundation for the future of social experiences and almost every other category as well. Reality Labs crossed $1 billion in revenue in Q4 for the first time, with Quest having a strong holiday season. Quest 3 is off to a strong start and I expect it to continue to be the most popular mixed reality device. With Quest 3 and Quest 2 both performing well, we saw the Quest app was actually the most downloaded app in the app store on Christmas Day. I want to give a shout out to Asgard’s Wrath 2, which was developed by one of our in-house studios and received IGN’s 10/10 ‘masterpiece’ rating, making it one of the best rated games out there – not just in VR, but in any game, on any platform, ever. It’s a really good sign that we’re able to deliver that quality of work at Meta. Horizon is growing quickly too. It’s now in the top 10 most-used apps on Quest and we have an exciting roadmap ahead. This is another example of applying the long-term playbook I discussed earlier with AI, but in another area. We take the time to build up the core tech and tune the experience, and then when it’s ready we’re good at growing things. Our focus for this year will be on growing the mobile version of Horizon as well as the VR one. Ray-Ban Meta smart glasses are also off to a very strong start, both in sales and engagement. Our partner EssilorLuxottica is already planning on making more than we’d both expected due to high demand. Engagement and retention are also significantly higher than the first version of the glasses. The experience is just a lot better with Meta AI in there, as well as a higher resolution camera, better audio, and more. We also have an exciting roadmap of software improvements ahead, starting with rolling out multimodal AI and then some other really exciting new AI features later in the year. I said this before, but I think that people will want new categories of devices that let you frictionlessly engage with AIs frequently throughout the day without having to take out your phone and press a button and point it at what you want it to see. I think that smart glasses are going to be a compelling form factor for this, and it’s a good example of how our AI and metaverse visions are connected. In addition to AI and the metaverse, we’re continuing to improve our apps and ads businesses as well. Reels and our discovery engine remain a priority and major driver of engagement, and messaging continues to be our focus for building the next revenue pillar of our business before our longer term work reaches scale. But since I went a bit longer on the other areas today, I’m just going to mention a few highlights here. Reels continues to do very well across both Instagram and Facebook. People reshare Reels 3.5 billion times every day. Reels is now contributing to our net revenue across our apps. The biggest opportunity going forward is unifying our recommendations systems across Reels and other types of video. That will help people discover the best content across our systems no matter what format it’s in. WhatsApp is also doing very well and the most exciting new trend here is that it’s succeeding more broadly in the United States where there’s a real appetite for a private, secure, and cross-platform messaging app that everyone can use.
Given the strategic importance of the US and its outsized importance for revenue, this is just a huge opportunity. Threads is growing steadily with more than 130 million monthly actives. I’m optimistic we can keep the pace of improvements and growth going, and show that a friendly discussion-oriented app can be as widely used as the most popular social apps. All right, so that’s what I wanted to cover today. Our communities are growing and our business is back on track. Once again, a big thank you to all of our employees, partners, shareholders and everyone in our community for sticking with us and for making 2023 such a success. I’m looking forward to another exciting year ahead.

Post by Mark Zuckerberg, Thursday, February 1, 2024

The above is the full transcript from the Q4 announcement. The key passage mentioning Threads is the following.

Threads is growing steadily with more than 130 million monthly actives. I’m optimistic we can keep the pace of improvements and growth going, and show that a friendly discussion-oriented app can be as widely used as the most popular social apps. All right, so that’s what I wanted to cover today. Our communities are growing and our business is back on track.

According to the announcement, Threads reached 130 million monthly active users as of the end of December 2023.

As reported previously in "Report: Threads grows to 120 million MAU as users return following numerous feature updates | GNU social JP Web", Threads had 120 million MAU at the end of Q3 in late September. It therefore gained about 10 million users in three months. Bluesky reportedly grew from roughly 2 million to 3 million users over a similar period, so the two are an order of magnitude apart.
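To make that comparison concrete, here is a minimal sketch using only the figures quoted in this article (Threads 120 million to 130 million over the quarter, Bluesky roughly 2 million to 3 million over a similar period); the Bluesky numbers are the approximate values cited above, not official figures.

```python
# Rough growth comparison based on the figures cited in this article.
threads_q3_mau = 120_000_000   # end of Q3 2023 (late September)
threads_q4_mau = 130_000_000   # end of Q4 2023 (late December)
bluesky_start = 2_000_000      # approximate, per the article
bluesky_end = 3_000_000        # approximate, per the article

threads_gain = threads_q4_mau - threads_q3_mau   # +10,000,000 in ~3 months
bluesky_gain = bluesky_end - bluesky_start       # +1,000,000 in a similar period

print(f"Threads gained {threads_gain:,} MAU; Bluesky gained about {bluesky_gain:,} users")
print(f"Ratio of absolute gains: {threads_gain / bluesky_gain:.0f}x")  # ~10x, one order of magnitude
```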

Across the family of apps (Facebook, Instagram, Messenger, and WhatsApp), 3.19 billion people now use at least one app daily, an increase of 50 million since the end of September 2023.

App downloads are also strong: Threads ranked 6th in combined App Store/Google Play downloads, while X/Twitter ranked 36th. Threads drew plenty of criticism at launch, but judging from this announcement those concerns look likely to prove unfounded, and the service appears set to keep growing steadily.
