Get live statistics and analysis of Ross Taylor's profile on X/Twitter
Ross Taylor @rosstaylor90
Building @GenReasoning. Previously lots of other things like: reasoning lead Meta AI, Llama 3/2, Galactica, Papers with Code.
1k following · 10k followers
Archetype analysis: The Thought Leader
Ross Taylor is a pioneering force in AI research, deeply invested in advancing scientific understanding through innovative reasoning models. With a rich history at top labs such as Meta AI, he openly reflects on the trials and triumphs behind cutting-edge projects like Galactica and LLaMA. His tweets reveal a thoughtful, transparent communicator who balances technical depth with candid insights on the AI research ecosystem.
Roast
Ross is the guy who spends hours debating the lifecycle of PPO rewards but might forget that most people just want to know if the robot can tell a joke without crashing. Genius-level deep dives, but sometimes you're preaching to the choir of eight people while the rest of us just want the TLDR.
Nice achievement
Spearheading the Galactica project and gracefully owning up to its public launch challenges while ensuring the foundational work powered subsequent breakthroughs like LLaMA 2 is a testament to his leadership and scientific resilience.
Life's purpose
To push the boundaries of AI reasoning and research transparency, enabling the AI community to build smarter, more reliable models through open dialogue and rigorous science.
Values and Beliefs
Ross values openness in research, scientific integrity, and the importance of learning from both successes and failures. He believes in sharing knowledge freely to drive collective progress, while acknowledging the complexities and limitations inherent in cutting-edge AI development.
Strength
Ross excels at clear, nuanced communication about complex AI topics, combining deep technical expertise with a genuine openness about project challenges. His ability to critically analyze the AI landscape and articulate lessons learned makes him a trusted thought leader.
Weakness
His candidness about past project missteps, while admirable, might sometimes fuel unnecessary controversy or misunderstandings among broader audiences less versed in AI nuances. Additionally, his detailed, technical style may limit broader engagement outside specialist circles.
Audience growth tips
To grow his audience on X, Ross should leverage his expertise with more accessible, bite-sized threads that distill complex ideas into engaging stories while maintaining his transparency. Collaborations with influencers or AMAs could also invite wider community interaction, broadening his reach beyond core AI experts.
Bonus
Fun fact: Ross led the creation of the Galactica model with an incredibly lean team of just 8 people, far fewer than typical teams, yet still managed to outperform much larger models in its domain.
Top tweets of Ross Taylor
I am the first author of the Galactica paper and have been quiet about it for a year. Maybe I will write a blog post talking about what actually happened, but if you want the TLDR:
1. Galactica was a base model trained on scientific literature and modalities.
2. We approached it with a number of hypotheses about data quality, reasoning, scientific modalities, LLM training, that hadn't been covered in the literature - you can read about these in the paper.
3. For its time, it was a good model for its domain; outperforming PaLM and Chinchilla with 10x and 2x less compute.
4. We did this with an 8-person team, which is an order of magnitude fewer people than other LLM teams at the time.
5. We were overstretched and lost situational awareness at launch by releasing a demo of a *base model* without checks. We were aware of what the potential criticisms would be, but we lost sight of the obvious in the workload we were under.
6. One of the considerations for a demo was that we wanted to understand the distribution of scientific queries that people would use for LLMs (useful for instruction tuning and RLHF). Obviously this was a free goal we gave to journalists, who instead queried it outside its domain. But yes, we should have known better.
7. We had a "good faith" assumption that we'd share the base model, warts and all, with four disclaimers about hallucinations on the demo - so people could see what it could do (openness). Again, obviously this didn't work.
8. A mistake on our part that didn't help was that people treated the site like a *product*. We put our vision etc. on the site, which misled people about expectations. We definitely did not view it as a product! It was a base model demo.
9. Pretty much every LLM researcher I've talked to (including at ICML recently) was complimentary about the strength of the research, which was sadly overshadowed by the demo drama - yes this was our fault for allowing this to happen.
10. Fortunately most of the lessons and work went into LLaMA 2; the RLHF research you see in that paper is from the Galactica team. Further research coming soon that should be interesting.
It's a bit of a riddle because on the one hand the demo drama could have been avoided by us, but at the same time the "fake science" fears were very ridiculous and despite being on HuggingFace for a year, the model hasn't caused any damage.
To reiterate: the anti-Galactica commentary was really stupid; however, we should never have allowed it to happen - we should have launched it better.
I stick by the research completely - and even the demo decision, which was unprecedented openness for a big company with an LLM at the time, wasn't inherently bad - but it was just misguided given the attack vectors it opened for us.
Despite all the above, I would do it all again in a heartbeat. Better to do something and regret it than not do anything at all. Still hurts though!
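As a rough sanity check on the compute comparison in point 3 of the thread, the standard training-FLOPs approximation C ≈ 6 · N · D (compute ≈ 6 × parameters × training tokens) can be applied to approximate publicly reported model sizes and token counts. The figures below are illustrative assumptions, not numbers stated in the thread:

C ≈ 6 · N · D
Galactica-120B : 6 × 120e9 params × ~450e9 tokens ≈ 3.2e23 FLOPs
Chinchilla-70B : 6 × 70e9 params × 1.4e12 tokens ≈ 5.9e23 FLOPs (~1.8x Galactica)
PaLM-540B      : 6 × 540e9 params × 780e9 tokens ≈ 2.5e24 FLOPs (~8x Galactica)

Under these assumed figures the ordering is consistent with the "10x and 2x less compute" claim for PaLM and Chinchilla respectively, with the PaLM gap landing nearer 8x; the exact ratios depend on the token counts assumed.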
{"data":{"__meta":{"device":false,"path":"/creators/rosstaylor90"},"/creators/rosstaylor90":{"data":{"user":{"id":"524807755","name":"Ross Taylor","description":"Building @GenReasoning. Previously lots of other things like: reasoning lead Meta AI, Llama 3/2, Galactica, Papers with Code.","followers_count":10270,"friends_count":1139,"statuses_count":2594,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1620286816338206720/IobQvrSe_normal.jpg","screen_name":"rosstaylor90","location":"âœÂČf = 0","entities":{"description":{"urls":[]},"url":{"urls":[{"display_url":"rossjtaylor.com","expanded_url":"https://rossjtaylor.com","url":"https://t.co/ZtovfqrcwC","indices":[0,23]}]}}},"details":{"type":"The Thought Leader","description":"Ross Taylor is a pioneering force in AI research, deeply invested in advancing scientific understanding through innovative reasoning models. With a rich history at top labs such as Meta AI, he openly reflects on the trials and triumphs behind cutting-edge projects like Galactica and LLaMA. His tweets reveal a thoughtful, transparent communicator who balances technical depth with candid insights on the AI research ecosystem.","purpose":"To push the boundaries of AI reasoning and research transparency, enabling the AI community to build smarter, more reliable models through open dialogue and rigorous science.","beliefs":"Ross values openness in research, scientific integrity, and the importance of learning from both successes and failures. He believes in sharing knowledge freely to drive collective progress, while acknowledging the complexities and limitations inherent in cutting-edge AI development.","facts":"Fun fact: Ross led the creation of the Galactica model with an incredibly lean team of just 8 peopleâfar fewer than typical teamsâyet still managed to outperform much larger models in its domain.","strength":"Ross excels at clear, nuanced communication about complex AI topics, combining deep technical expertise with a genuine openness about project challenges. His ability to critically analyze the AI landscape and articulate lessons learned makes him a trusted thought leader.","weakness":"His candidness about past project missteps, while admirable, might sometimes fuel unnecessary controversy or misunderstandings among broader audiences less versed in AI nuances. Additionally, his detailed, technical style may limit broader engagement outside specialist circles.","recommendation":"To grow his audience on X, Ross should leverage his expertise with more accessible, bite-sized threads that distill complex ideas into engaging stories while maintaining his transparency. Collaborations with influencers or AMAs could also invite wider community interaction, broadening his reach beyond core AI experts.","roast":"Ross is the guy who spends hours debating the lifecycle of PPO rewards but might forget that most people just want to know if the robot can tell a joke without crashing. 
Genius-level deep dives, but sometimes youâre preaching to the choir of eight people while the rest of us just want the TLDR.","win":"Spearheading the Galactica project and gracefully owning up to its public launch challenges while ensuring the foundational work powered subsequent breakthroughs like LLaMA 2 is a testament to his leadership and scientific resilience."},"tweets":[{"bookmarked":false,"display_text_range":[0,278],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1724438903585673225","quoted_status_permalink":{"url":"https://t.co/EPZqIly7QX","expanded":"https://twitter.com/sharongoldman/status/1724438903585673225","display":"x.com/sharongoldman/âŠ"},"retweeted":false,"fact_check":null,"id":"1724547381092573352","view_count":960463,"bookmark_count":744,"created_at":1699999110000,"favorite_count":2659,"quote_count":52,"reply_count":89,"retweet_count":309,"user_id_str":"524807755","conversation_id_str":"1724547381092573352","full_text":"I am the first author of the Galactica paper and have been quiet about it for a year. Maybe I will write a blog post talking about what actually happened, but if you want the TLDR:\n\n1. Galactica was a base model trained on scientific literature and modalities. \n2. We approached it with a number of hypotheses about data quality, reasoning, scientific modalities, LLM training, that hadnât been covered in the literature - you can read about these in the paper.\n3. For its time, it was a good model for its domain; outperforming PaLM and Chinchilla with 10x and 2x less compute.\n4. We did this with a 8 person team which is an order of magnitude fewer people than other LLM teams at the time.\n5. We were overstretched and lost situational awareness at launch by releasing demo of a *base model* without checks. We were aware of what potential criticisms would be, but we lost sight of the obvious in the workload we were under.\n6. One of the considerations for a demo was we wanted to understand the distribution of scientific queries that people would use for LLMs (useful for instruction tuning and RLHF). Obviously this was a free goal we gave to journalists who instead queried it outside its domain. But yes we should have known better.\n7. We had a âgood faithâ assumption that weâd share the base model, warts and all, with four disclaimers about hallucinations on the demo - so people could see what it could do (openness). Again, obviously this didnât work.\n8. A mistake on our part that didnât help was people treated the site like a *product*. We put our vision etc on the site, which misled about expectations. We definitely did not view it as a product! It was a base model demo.\n9. Pretty much every LLM researcher Iâve talked to (including at ICML recently) was complimentary about the strength of the research, which was sadly overshadowed by the demo drama - yes this was our fault for allowing this to happen.\n10. Fortunately most of the lessons and work went into LLaMA 2; the RLHF research you see in that paper is from the Galactica team. Further research coming soon that should be interesting.\n\nItâs a bit of a riddle because on the one hand the demo drama could have been avoided by us, but at the same time the âfake scienceâ fears were very ridiculous and despite being on HuggingFace for a year, the model hasnât caused any damage. 
\n\nTo reiterate: the anti-Galactica commentary was really stupid, however we should not have allowed that to even happen if we had launched it better.\n\nI stick by the research completely - and even the demo decision, which was unprecedented openness for a big company with an LLM at the time, wasnât inherently bad - but it was just misguided given the attack vectors it opened for us.\n\nDespite all the above, I would do it all again in a heartbeat. Better to do something and regret, then not do anything at all. Still hurts though! đ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,198],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"1714580962569588736","name":"DeepSeek","screen_name":"deepseek_ai","indices":[31,43]}]},"favorited":false,"lang":"en","retweeted":false,"fact_check":null,"id":"1881424918132502918","view_count":45109,"bookmark_count":73,"created_at":1737401630000,"favorite_count":904,"quote_count":8,"reply_count":27,"retweet_count":61,"user_id_str":"524807755","conversation_id_str":"1881424918132502918","full_text":"Last tweet on this but the way @deepseek_ai does launches is beautiful: no hype, arrogance or vague-posting: just sharing something great with the world. US tech companies look cringe in comparison.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,160],"entities":{"hashtags":[],"media":[{"display_url":"pic.x.com/sgUHsNQRXr","expanded_url":"https://x.com/rosstaylor90/status/1968355720820338894/photo/1","id_str":"1968355714306277376","indices":[161,184],"media_key":"3_1968355714306277376","media_url_https":"https://pbs.twimg.com/media/G1EBF-bWAAAOfrl.jpg","type":"photo","url":"https://t.co/sgUHsNQRXr","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[]},"medium":{"faces":[]},"small":{"faces":[]},"orig":{"faces":[]}},"sizes":{"large":{"h":1281,"w":991,"resize":"fit"},"medium":{"h":1200,"w":928,"resize":"fit"},"small":{"h":680,"w":526,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":1281,"width":991,"focus_rects":[{"x":0,"y":0,"w":991,"h":555},{"x":0,"y":0,"w":991,"h":991},{"x":0,"y":0,"w":991,"h":1130},{"x":95,"y":0,"w":641,"h":1281},{"x":0,"y":0,"w":991,"h":1281}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1968355714306277376"}}}],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"extended_entities":{"media":[{"display_url":"pic.x.com/sgUHsNQRXr","expanded_url":"https://x.com/rosstaylor90/status/1968355720820338894/photo/1","id_str":"1968355714306277376","indices":[161,184],"media_key":"3_1968355714306277376","media_url_https":"https://pbs.twimg.com/media/G1EBF-bWAAAOfrl.jpg","type":"photo","url":"https://t.co/sgUHsNQRXr","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[]},"medium":{"faces":[]},"small":{"faces":[]},"orig":{"faces":[]}},"sizes":{"large":{"h":1281,"w":991,"resize":"fit"},"medium":{"h":1200,"w":928,"resize":"fit"},"small":{"h":680,"w":526,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":1281,"width":991,"focus_rects":[{"x":0,"y":0,"w":991,"h":555},{"x":0,"y":0,"w":991,"h":991},{"x":0,"y":0,"w":991,"h":1130},{"x":95,"y":0,"w":641,"h":1281},{"x":0,"y":0,"w":991,"h":1281}]},"allow_download_status":{"allow_download
":true},"media_results":{"result":{"media_key":"3_1968355714306277376"}}}]},"favorited":false,"lang":"en","possibly_sensitive":false,"possibly_sensitive_editable":true,"retweeted":false,"fact_check":null,"id":"1968355720820338894","view_count":160087,"bookmark_count":860,"created_at":1758127548000,"favorite_count":903,"quote_count":20,"reply_count":10,"retweet_count":147,"user_id_str":"524807755","conversation_id_str":"1968355720820338894","full_text":"Supplementary information for the new DeepSeek R1 Nature paper is very interesting!\n\nDetails on training data, hyperparameters, base model importance, and more. https://t.co/sgUHsNQRXr","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,277],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"1795737220340428800","name":"General Reasoning","screen_name":"GenReasoning","indices":[1677,1690]}]},"favorited":false,"lang":"en","retweeted":false,"fact_check":null,"id":"1959494279077728549","view_count":110507,"bookmark_count":592,"created_at":1756014816000,"favorite_count":709,"quote_count":11,"reply_count":30,"retweet_count":46,"user_id_str":"524807755","conversation_id_str":"1959494279077728549","full_text":"Most takes on RL environments are bad.\n\n1. There are hardly any high-quality RL environments and evals available. Most agentic environments and evals are flawed when you look at the details. Itâs a crisis: and no one is talking about it because theyâre being hoodwinked by labs marketing their models on flawed evals.\n\n2. Even the best public RL environments and agentic evals suck, and usually canât be used by labs without modification. Academics often publish-and-forget instead of doing the necessary follow-up work to make the envs/evals useful for labs.\n\n3. The best person to make an environment is someone deeply knowledgeable about a field, not a high-level generalist or newbie - đŠ not đŠ - but most envs are being made by generalists or low-skill contractors.\n\n4. People are too focused on whether a problem is verifiable or not, not what kind of capabilities they want to bring into being. We donât need more math and puzzle environments. The usefulness of an environment is proportional to its difficulty of construction.\n\n5. Saying you want to âscale RL environmentsâ is as meaningless as âscale is all you needâ in that it says nothing about your choice of what to scale. \n\n6. People are treating RL environment scaling as a new type of pretraining (creating a new internet), but pretraining has extremely high diversity, and expecting a single company (or collection of companies) to replicate this diversity is unrealistic. 
That means generalisation will be slower to emerge than the previous paradigm - and so there is more leverage in choosing which environments to build first.\n\nIf youâd like to help answer the right questions in this new space, join us at @GenReasoning.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,278],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1886505669622149139","quoted_status_permalink":{"url":"https://t.co/zfXtPYpSgB","expanded":"https://twitter.com/rdolmedo_/status/1886505669622149139","display":"x.com/rdolmedo_/statâŠ"},"retweeted":false,"fact_check":null,"id":"1886625126222852208","view_count":134109,"bookmark_count":389,"created_at":1738641456000,"favorite_count":664,"quote_count":14,"reply_count":28,"retweet_count":87,"user_id_str":"524807755","conversation_id_str":"1886625126222852208","full_text":"No one is saying RL didnât work for reasoning. The argument is about internal reasoning emergence, not absolute performance boosts with RL.\n\nQuite the opposite in fact - we had PPO on Llama 2 base models 2 years ago with verifiable rewards and had 90%+ on GSM8k. We already knew that RL worked. Hell we even did this with the bronze-age Galactica model and it also worked (see my ICML talk 3 years ago).\n\nWhat didnât âworkâ was that we didnât see emergence of longer traces with backtracking, checking, error-correction and other branching-like behaviour.\n\nSo when people say âthe base model wasnât strong enoughâ, the contention is that there wasnât enough knowledge in the base model, subsequent RL compute and context window length for the model to develop long CoT, internal reasoning on its own.\n\nThatâs the argument, not whether âRL workedâ. It always did, but the magic seems to have emerged relatively recently.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,280],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","retweeted":false,"fact_check":null,"id":"1945328176185614481","view_count":50600,"bookmark_count":101,"created_at":1752637354000,"favorite_count":405,"quote_count":8,"reply_count":23,"retweet_count":22,"user_id_str":"524807755","conversation_id_str":"1945328176185614481","full_text":"Itâs funny that people on this site think major LLM efforts are talent-bound rather than org-bound.\n\nThe talent differential has never been big between major orgs. Most of the difference in outcomes is due to organisational factors - like allocating compute to the right bets, and letting good research and engineering triumph over destructive politics.\n\nThis makes for a less sexy story though. People prefer to believe that breakthroughs are made by lone geniuses - instead of the cumulative effort of many nameless, social media averse people â supported by an org that allows the best ideas to win and manages big egos. \n\nIf you donât believe me - then consider how some researchers suddenly gain or lose impact and productivity when they switch orgs. Was it because they gained or lost IQ points? 
đ\n\n(Sorry, this is super obvious to anyone whoâs actually worked in these labs - but you wouldnât believe it based on the X feed right now!)","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,276],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","retweeted":false,"fact_check":null,"id":"1732660216439664811","view_count":123694,"bookmark_count":375,"created_at":1701933361000,"favorite_count":393,"quote_count":7,"reply_count":50,"retweet_count":36,"user_id_str":"524807755","conversation_id_str":"1732660216439664811","full_text":"Why are LLMs bad at reasoning?\n\nOne theory says this is due to weaknesses in maximum likelihood, where the probability mass âovergeneralisesâ to low quality solutions.\n\nBecause our pretraining objective (likelihood) doesnât transfer to our evaluation objective (accuracy), the theory goes that we need reinforce high quality solutions to fix the problem.\n\nBut this theory is probably incorrect for reasoning in academic subjects. The internet is heavily biased towards examples of correct solutions - textbooks, incentive-aligned sites like StackExchange, etc. So poor performance is unlikely to be explained by the prevalence of incorrect solutions.\n\nInstead the problem is that reasoning is a task which requires high precision. So it is harder to generalise from solutions of seen problems to solutions for unseen problems.\n\nAnd once you make a mistake, you are conditioning on an unlikely sequence of tokens (dissimilar to what appears in training) so errors compound.\n\nThat means we need an order of magnitude more compute to get reasoning to the level of precision required to perform well compared to other tasks.\n\nThis is also why reasoning has been the âlast to scaleâ of the classical LLM tasks, and why MATH has been the hardest benchmark to excel at.\n\nBut high precision tasks remain: as we move into more agentic settings and task horizon increases, LLMs will need to reason over much longer time periods - where similar problems will apply.\n\nIt seems unlikely that simply more training FLOPs will solve the problem. Eventually weâll need to lean on search again as a way to find better outputs and achieve high precision.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,279],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","retweeted":false,"fact_check":null,"id":"1909515557159792794","view_count":41574,"bookmark_count":53,"created_at":1744098960000,"favorite_count":322,"quote_count":3,"reply_count":15,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1909515557159792794","full_text":"Maybe OpenAI had a point with âhigh taste testersâ.\n\nI didnât like the phrase initially because it felt a little elitist. 
But maybe I can reconcile with it by treating âhigh tasteâ as folks who care more about the outputs they are getting, and scrutinise them more carefully.\n\nIn other words: optimise models for the users who care the most / who spend more glucose scrutinising your outputs.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,276],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","retweeted":false,"fact_check":null,"id":"1976397439340101698","view_count":28072,"bookmark_count":209,"created_at":1760044843000,"favorite_count":276,"quote_count":3,"reply_count":9,"retweet_count":30,"user_id_str":"524807755","conversation_id_str":"1976397439340101698","full_text":"RL is not enough. It only reaches its potential when combined with other ideas.\n\nThe most famous example is AlphaZero. RL was combined with self-play which created an implicit task curriculum that evolved through training. This is very different from many RL datasets for LLMs which have a fixed set of tasks.\n\nEven where the task set is fixed, RL still needs to be combined with other ideas to show signs of life. Thinking models only come to life through RL when we remove length penalisation and have enough prior knowledge in the training data.\n\nLooking ahead, long horizon tasks will require much more exploration and âgoing off-pisteâ. But current RL methods induce policy entropy collapse. Diverse mid-training before RL could help, but fundamentally current RL objectives donât reward âinterestingnessâ and deviating from the âcurrent (high reward) thingâ.\n\nAnd yet, discovery is all about deviating from the current thing - and the best ideas come from the wilderness. Deep learning is no exception. It was interesting long before it reached its true potential. We didnât wait to introduce backprop and SGD until compute and data came online :).\n\nThis is a long way of saying: crude RL maximalism is overhyped. The magic comes from the interplay of RL with other things, and the angels are in the details.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,274],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1881318142083018951","quoted_status_permalink":{"url":"https://t.co/mNXfj5pscU","expanded":"https://twitter.com/deepseek_ai/status/1881318142083018951","display":"x.com/deepseek_ai/stâŠ"},"retweeted":false,"fact_check":null,"id":"1881372810485899716","view_count":39247,"bookmark_count":154,"created_at":1737389206000,"favorite_count":275,"quote_count":1,"reply_count":9,"retweet_count":20,"user_id_str":"524807755","conversation_id_str":"1881372810485899716","full_text":"R1 paper and work truly excellent. Questions about the reward shaping + training data:\n\n1. The paper says the reward âmainly consists of accuracy and format rewardsâ. Is this it or do they have any other rewards to incentivise longer traces?\n\n2. How many unique (verifiable) questions do they need for things to kick into gear in the RL stage?\n\nBut I am very glad they have helped put to bed PRMs, MCTS and all that excess complexity! 
đȘŠ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,280],"entities":{"hashtags":[],"media":[{"display_url":"pic.x.com/Giybin4QP8","expanded_url":"https://x.com/rosstaylor90/status/1887659393237307459/photo/1","id_str":"1887657885598875648","indices":[281,304],"media_key":"3_1887657885598875648","media_url_https":"https://pbs.twimg.com/media/GjJO2ICWAAAWosG.jpg","type":"photo","url":"https://t.co/Giybin4QP8","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[]},"medium":{"faces":[]},"small":{"faces":[]},"orig":{"faces":[]}},"sizes":{"large":{"h":1016,"w":1644,"resize":"fit"},"medium":{"h":742,"w":1200,"resize":"fit"},"small":{"h":420,"w":680,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":1016,"width":1644,"focus_rects":[{"x":0,"y":73,"w":1644,"h":921},{"x":271,"y":0,"w":1016,"h":1016},{"x":334,"y":0,"w":891,"h":1016},{"x":525,"y":0,"w":508,"h":1016},{"x":0,"y":0,"w":1644,"h":1016}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1887657885598875648"}}},{"display_url":"pic.x.com/Giybin4QP8","expanded_url":"https://x.com/rosstaylor90/status/1887659393237307459/photo/1","id_str":"1887657979186495488","indices":[281,304],"media_key":"3_1887657979186495488","media_url_https":"https://pbs.twimg.com/media/GjJO7krXwAA_Pv9.jpg","type":"photo","url":"https://t.co/Giybin4QP8","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[{"x":483,"y":209,"h":88,"w":88},{"x":598,"y":758,"h":115,"w":115}]},"medium":{"faces":[{"x":337,"y":145,"h":61,"w":61},{"x":417,"y":529,"h":80,"w":80}]},"small":{"faces":[{"x":191,"y":82,"h":34,"w":34},{"x":236,"y":300,"h":45,"w":45}]},"orig":{"faces":[{"x":483,"y":209,"h":88,"w":88},{"x":598,"y":758,"h":115,"w":115}]}},"sizes":{"large":{"h":1306,"w":1718,"resize":"fit"},"medium":{"h":912,"w":1200,"resize":"fit"},"small":{"h":517,"w":680,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":1306,"width":1718,"focus_rects":[{"x":0,"y":0,"w":1718,"h":962},{"x":412,"y":0,"w":1306,"h":1306},{"x":572,"y":0,"w":1146,"h":1306},{"x":1065,"y":0,"w":653,"h":1306},{"x":0,"y":0,"w":1718,"h":1306}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1887657979186495488"}}}],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"extended_entities":{"media":[{"display_url":"pic.x.com/Giybin4QP8","expanded_url":"https://x.com/rosstaylor90/status/1887659393237307459/photo/1","id_str":"1887657885598875648","indices":[281,304],"media_key":"3_1887657885598875648","media_url_https":"https://pbs.twimg.com/media/GjJO2ICWAAAWosG.jpg","type":"photo","url":"https://t.co/Giybin4QP8","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[]},"medium":{"faces":[]},"small":{"faces":[]},"orig":{"faces":[]}},"sizes":{"large":{"h":1016,"w":1644,"resize":"fit"},"medium":{"h":742,"w":1200,"resize":"fit"},"small":{"h":420,"w":680,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":1016,"width":1644,"focus_rects":[{"x":0,"y":73,"w":1644,"h":921},{"x":271,"y":0,"w":1016,"h":1016},{"x":334,"y":0,"w":891,"h":1016},{"x":525,"y":0,"w":508,"h":1016},{"x":0,"y":0,"w":1644,"h":1016}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1887657885598875648"}}},{"display_url":"pic.x.com/Giybin4QP8","expanded_url":"https:
//x.com/rosstaylor90/status/1887659393237307459/photo/1","id_str":"1887657979186495488","indices":[281,304],"media_key":"3_1887657979186495488","media_url_https":"https://pbs.twimg.com/media/GjJO7krXwAA_Pv9.jpg","type":"photo","url":"https://t.co/Giybin4QP8","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[{"x":483,"y":209,"h":88,"w":88},{"x":598,"y":758,"h":115,"w":115}]},"medium":{"faces":[{"x":337,"y":145,"h":61,"w":61},{"x":417,"y":529,"h":80,"w":80}]},"small":{"faces":[{"x":191,"y":82,"h":34,"w":34},{"x":236,"y":300,"h":45,"w":45}]},"orig":{"faces":[{"x":483,"y":209,"h":88,"w":88},{"x":598,"y":758,"h":115,"w":115}]}},"sizes":{"large":{"h":1306,"w":1718,"resize":"fit"},"medium":{"h":912,"w":1200,"resize":"fit"},"small":{"h":517,"w":680,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":1306,"width":1718,"focus_rects":[{"x":0,"y":0,"w":1718,"h":962},{"x":412,"y":0,"w":1306,"h":1306},{"x":572,"y":0,"w":1146,"h":1306},{"x":1065,"y":0,"w":653,"h":1306},{"x":0,"y":0,"w":1718,"h":1306}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1887657979186495488"}}}]},"favorited":false,"lang":"en","possibly_sensitive":false,"possibly_sensitive_editable":true,"retweeted":false,"fact_check":null,"id":"1887659393237307459","view_count":41378,"bookmark_count":322,"created_at":1738888045000,"favorite_count":268,"quote_count":3,"reply_count":10,"retweet_count":22,"user_id_str":"524807755","conversation_id_str":"1887659393237307459","full_text":"The aha moment is indeed overrated. \n\nThe internal monologue style of R1 appears to come from SFT. The length scaling from R1-Zero appears to come from learning to verify after answering. These are two separate things!\n\nThis leads to a very different character of reasoning trace. R1-Zero looks like a final answer; R1-Zero looks like internal thought.\n\nExample 1: R1-Zero on the left; R1 on the right. R1-Zero looks like a typical CoT trace...R1 looks like o1-style branching (\"wait, no\").\n\nSo if R1-Zero looks like a regular CoT trace, then what is different? đ§”...","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,278],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"1795737220340428800","name":"General Reasoning","screen_name":"GenReasoning","indices":[83,96]},{"id_str":"1795737220340428800","name":"General Reasoning","screen_name":"GenReasoning","indices":[83,96]}]},"favorited":false,"lang":"en","retweeted":false,"fact_check":null,"id":"1957900422867480726","view_count":41747,"bookmark_count":118,"created_at":1755634811000,"favorite_count":249,"quote_count":1,"reply_count":11,"retweet_count":28,"user_id_str":"524807755","conversation_id_str":"1957900422867480726","full_text":"If youâre working at an AI lab in London and looking for a new gig - Iâm hiring at @GenReasoning!\n\nSmall, highly collaborative, kind team working on frontier research topics. 
No politics (imagine that), high levels of autonomy and a chance to shape entirely new capabilities for AI.\n\nWeâre having a blast already but always on the lookout for good people to join the fun đ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,277],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1892983129528222111","quoted_status_permalink":{"url":"https://t.co/imKvQdOs1k","expanded":"https://twitter.com/genreasoning/status/1892983129528222111","display":"x.com/genreasoning/sâŠ"},"retweeted":false,"fact_check":null,"id":"1892983452003082605","view_count":29502,"bookmark_count":159,"created_at":1740157399000,"favorite_count":245,"quote_count":4,"reply_count":7,"retweet_count":34,"user_id_str":"524807755","conversation_id_str":"1892983452003082605","full_text":"đ Excited to release General Reasoning: a new community resource for building open reasoning models. \n\nWeâre looking to make personal, open reasoners a reality. Starting with a small step in that direction today!\n\nRead the thread in the quote tweet for details, or my personal analysis below!","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,280],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1911143014258405420","quoted_status_permalink":{"url":"https://t.co/4cug1QpXsP","expanded":"https://twitter.com/wenhuchen/status/1911143014258405420","display":"x.com/wenhuchen/statâŠ"},"retweeted":false,"fact_check":null,"id":"1911332863577928008","view_count":25136,"bookmark_count":164,"created_at":1744532240000,"favorite_count":204,"quote_count":1,"reply_count":8,"retweet_count":24,"user_id_str":"524807755","conversation_id_str":"1911332863577928008","full_text":"Distillation outperforms RL recipes at smaller scales.\n\nOne simple way to think about this is a large model is a better way to discover good data than a smaller one. A point made in original DeepSeek-R1 paper, but missed by many in the wake of that release in the GRPO wave.\n\nThat being said, thereâs still an opportunity to do effective RL for smaller models. But hinges on getting details right to make training more efficient, including filtering the difficulty of the RL corpus and several algorithmic tweaks (eg overlong filtering).\n\nI also think the community is overoptimising benchmarks like AIME, MATH500 and GPQA. Itâs not very interesting to report a model doing well on just these benchmarks anymore. 
What matters is if the recipe generalises to more benchmarks that donât have heavy optimisation pressure applied to them.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,188],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1906836403096629507","quoted_status_permalink":{"url":"https://t.co/NZcQdwwdRE","expanded":"https://twitter.com/slow_developer/status/1906836403096629507","display":"x.com/slow_developerâŠ"},"retweeted":false,"fact_check":null,"id":"1906979663349428429","view_count":10914,"bookmark_count":7,"created_at":1743494356000,"favorite_count":182,"quote_count":2,"reply_count":10,"retweet_count":3,"user_id_str":"524807755","conversation_id_str":"1906979663349428429","full_text":"Sorry, but âwe have internal models that are betterâ public statements are very lame - especially when Microsoft doesnât have a track record of delivering LLMs that are anywhere near SoTA.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,278],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1977594077706420339","quoted_status_permalink":{"url":"https://t.co/h0yqKXZ8YS","expanded":"https://twitter.com/harjtaggar/status/1977594077706420339","display":"x.com/harjtaggar/staâŠ"},"retweeted":false,"fact_check":null,"id":"1977641594619609405","view_count":17519,"bookmark_count":21,"created_at":1760341473000,"favorite_count":179,"quote_count":0,"reply_count":9,"retweet_count":11,"user_id_str":"524807755","conversation_id_str":"1977641594619609405","full_text":"Most of the folks who were early on AI cared about AI safety from the outset.\n\nMost of the anti AI safety takes I see are from folks who only got involved after ChatGPT - once the technology had been de-risked by others.\n\nSo revision:\n\nâThose who take AI seriously care about AI safety; those who only care about short-term profits donâtâ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}],"ctweets":[{"bookmarked":false,"display_text_range":[0,278],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1724438903585673225","quoted_status_permalink":{"url":"https://t.co/EPZqIly7QX","expanded":"https://twitter.com/sharongoldman/status/1724438903585673225","display":"x.com/sharongoldman/âŠ"},"retweeted":false,"fact_check":null,"id":"1724547381092573352","view_count":960463,"bookmark_count":744,"created_at":1699999110000,"favorite_count":2659,"quote_count":52,"reply_count":89,"retweet_count":309,"user_id_str":"524807755","conversation_id_str":"1724547381092573352","full_text":"I am the first author of the Galactica paper and have been quiet about it for a year. Maybe I will write a blog post talking about what actually happened, but if you want the TLDR:\n\n1. Galactica was a base model trained on scientific literature and modalities. \n2. We approached it with a number of hypotheses about data quality, reasoning, scientific modalities, LLM training, that hadnât been covered in the literature - you can read about these in the paper.\n3. 
For its time, it was a good model for its domain; outperforming PaLM and Chinchilla with 10x and 2x less compute.\n4. We did this with a 8 person team which is an order of magnitude fewer people than other LLM teams at the time.\n5. We were overstretched and lost situational awareness at launch by releasing demo of a *base model* without checks. We were aware of what potential criticisms would be, but we lost sight of the obvious in the workload we were under.\n6. One of the considerations for a demo was we wanted to understand the distribution of scientific queries that people would use for LLMs (useful for instruction tuning and RLHF). Obviously this was a free goal we gave to journalists who instead queried it outside its domain. But yes we should have known better.\n7. We had a âgood faithâ assumption that weâd share the base model, warts and all, with four disclaimers about hallucinations on the demo - so people could see what it could do (openness). Again, obviously this didnât work.\n8. A mistake on our part that didnât help was people treated the site like a *product*. We put our vision etc on the site, which misled about expectations. We definitely did not view it as a product! It was a base model demo.\n9. Pretty much every LLM researcher Iâve talked to (including at ICML recently) was complimentary about the strength of the research, which was sadly overshadowed by the demo drama - yes this was our fault for allowing this to happen.\n10. Fortunately most of the lessons and work went into LLaMA 2; the RLHF research you see in that paper is from the Galactica team. Further research coming soon that should be interesting.\n\nItâs a bit of a riddle because on the one hand the demo drama could have been avoided by us, but at the same time the âfake scienceâ fears were very ridiculous and despite being on HuggingFace for a year, the model hasnât caused any damage. \n\nTo reiterate: the anti-Galactica commentary was really stupid, however we should not have allowed that to even happen if we had launched it better.\n\nI stick by the research completely - and even the demo decision, which was unprecedented openness for a big company with an LLM at the time, wasnât inherently bad - but it was just misguided given the attack vectors it opened for us.\n\nDespite all the above, I would do it all again in a heartbeat. Better to do something and regret, then not do anything at all. Still hurts though! đ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,276],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","retweeted":false,"fact_check":null,"id":"1732660216439664811","view_count":123694,"bookmark_count":375,"created_at":1701933361000,"favorite_count":393,"quote_count":7,"reply_count":50,"retweet_count":36,"user_id_str":"524807755","conversation_id_str":"1732660216439664811","full_text":"Why are LLMs bad at reasoning?\n\nOne theory says this is due to weaknesses in maximum likelihood, where the probability mass âovergeneralisesâ to low quality solutions.\n\nBecause our pretraining objective (likelihood) doesnât transfer to our evaluation objective (accuracy), the theory goes that we need reinforce high quality solutions to fix the problem.\n\nBut this theory is probably incorrect for reasoning in academic subjects. 
The internet is heavily biased towards examples of correct solutions - textbooks, incentive-aligned sites like StackExchange, etc. So poor performance is unlikely to be explained by the prevalence of incorrect solutions.\n\nInstead the problem is that reasoning is a task which requires high precision. So it is harder to generalise from solutions of seen problems to solutions for unseen problems.\n\nAnd once you make a mistake, you are conditioning on an unlikely sequence of tokens (dissimilar to what appears in training) so errors compound.\n\nThat means we need an order of magnitude more compute to get reasoning to the level of precision required to perform well compared to other tasks.\n\nThis is also why reasoning has been the âlast to scaleâ of the classical LLM tasks, and why MATH has been the hardest benchmark to excel at.\n\nBut high precision tasks remain: as we move into more agentic settings and task horizon increases, LLMs will need to reason over much longer time periods - where similar problems will apply.\n\nIt seems unlikely that simply more training FLOPs will solve the problem. Eventually weâll need to lean on search again as a way to find better outputs and achieve high precision.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,277],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"1795737220340428800","name":"General Reasoning","screen_name":"GenReasoning","indices":[1677,1690]}]},"favorited":false,"lang":"en","retweeted":false,"fact_check":null,"id":"1959494279077728549","view_count":110507,"bookmark_count":592,"created_at":1756014816000,"favorite_count":709,"quote_count":11,"reply_count":30,"retweet_count":46,"user_id_str":"524807755","conversation_id_str":"1959494279077728549","full_text":"Most takes on RL environments are bad.\n\n1. There are hardly any high-quality RL environments and evals available. Most agentic environments and evals are flawed when you look at the details. Itâs a crisis: and no one is talking about it because theyâre being hoodwinked by labs marketing their models on flawed evals.\n\n2. Even the best public RL environments and agentic evals suck, and usually canât be used by labs without modification. Academics often publish-and-forget instead of doing the necessary follow-up work to make the envs/evals useful for labs.\n\n3. The best person to make an environment is someone deeply knowledgeable about a field, not a high-level generalist or newbie - đŠ not đŠ - but most envs are being made by generalists or low-skill contractors.\n\n4. People are too focused on whether a problem is verifiable or not, not what kind of capabilities they want to bring into being. We donât need more math and puzzle environments. The usefulness of an environment is proportional to its difficulty of construction.\n\n5. Saying you want to âscale RL environmentsâ is as meaningless as âscale is all you needâ in that it says nothing about your choice of what to scale. \n\n6. People are treating RL environment scaling as a new type of pretraining (creating a new internet), but pretraining has extremely high diversity, and expecting a single company (or collection of companies) to replicate this diversity is unrealistic. 
That means generalisation will be slower to emerge than the previous paradigm - and so there is more leverage in choosing which environments to build first.\n\nIf youâd like to help answer the right questions in this new space, join us at @GenReasoning.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,278],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1886505669622149139","quoted_status_permalink":{"url":"https://t.co/zfXtPYpSgB","expanded":"https://twitter.com/rdolmedo_/status/1886505669622149139","display":"x.com/rdolmedo_/statâŠ"},"retweeted":false,"fact_check":null,"id":"1886625126222852208","view_count":134109,"bookmark_count":389,"created_at":1738641456000,"favorite_count":664,"quote_count":14,"reply_count":28,"retweet_count":87,"user_id_str":"524807755","conversation_id_str":"1886625126222852208","full_text":"No one is saying RL didnât work for reasoning. The argument is about internal reasoning emergence, not absolute performance boosts with RL.\n\nQuite the opposite in fact - we had PPO on Llama 2 base models 2 years ago with verifiable rewards and had 90%+ on GSM8k. We already knew that RL worked. Hell we even did this with the bronze-age Galactica model and it also worked (see my ICML talk 3 years ago).\n\nWhat didnât âworkâ was that we didnât see emergence of longer traces with backtracking, checking, error-correction and other branching-like behaviour.\n\nSo when people say âthe base model wasnât strong enoughâ, the contention is that there wasnât enough knowledge in the base model, subsequent RL compute and context window length for the model to develop long CoT, internal reasoning on its own.\n\nThatâs the argument, not whether âRL workedâ. It always did, but the magic seems to have emerged relatively recently.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,198],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"1714580962569588736","name":"DeepSeek","screen_name":"deepseek_ai","indices":[31,43]}]},"favorited":false,"lang":"en","retweeted":false,"fact_check":null,"id":"1881424918132502918","view_count":45109,"bookmark_count":73,"created_at":1737401630000,"favorite_count":904,"quote_count":8,"reply_count":27,"retweet_count":61,"user_id_str":"524807755","conversation_id_str":"1881424918132502918","full_text":"Last tweet on this but the way @deepseek_ai does launches is beautiful: no hype, arrogance or vague-posting: just sharing something great with the world. US tech companies look cringe in comparison.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,280],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","retweeted":false,"fact_check":null,"id":"1945328176185614481","view_count":50600,"bookmark_count":101,"created_at":1752637354000,"favorite_count":405,"quote_count":8,"reply_count":23,"retweet_count":22,"user_id_str":"524807755","conversation_id_str":"1945328176185614481","full_text":"Itâs funny that people on this site think major LLM efforts are talent-bound rather than org-bound.\n\nThe talent differential has never been big between major orgs. 
Most of the difference in outcomes is due to organisational factors - like allocating compute to the right bets, and letting good research and engineering triumph over destructive politics.\n\nThis makes for a less sexy story though. People prefer to believe that breakthroughs are made by lone geniuses - instead of the cumulative effort of many nameless, social media averse people â supported by an org that allows the best ideas to win and manages big egos. \n\nIf you donât believe me - then consider how some researchers suddenly gain or lose impact and productivity when they switch orgs. Was it because they gained or lost IQ points? đ\n\n(Sorry, this is super obvious to anyone whoâs actually worked in these labs - but you wouldnât believe it based on the X feed right now!)","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,274],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1884966602724295026","quoted_status_permalink":{"url":"https://t.co/liPiItN552","expanded":"https://twitter.com/allen_ai/status/1884966602724295026","display":"x.com/allen_ai/statuâŠ"},"retweeted":false,"fact_check":null,"id":"1885054703160987949","view_count":20978,"bookmark_count":63,"created_at":1738267038000,"favorite_count":164,"quote_count":2,"reply_count":15,"retweet_count":19,"user_id_str":"524807755","conversation_id_str":"1885054703160987949","full_text":"More evidence for the base model hypothesis: RL more effective with a better base model. This is important as it shows why similar efforts a few years ago didnât reach escape velocity.\n\nâIf I have seen further it is only by standing on the shoulders of a better base modelâ.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[0,279],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","retweeted":false,"fact_check":null,"id":"1909515557159792794","view_count":41574,"bookmark_count":53,"created_at":1744098960000,"favorite_count":322,"quote_count":3,"reply_count":15,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1909515557159792794","full_text":"Maybe OpenAI had a point with âhigh taste testersâ.\n\nI didnât like the phrase initially because it felt a little elitist. 
Maybe OpenAI had a point with "high taste testers".

I didn't like the phrase initially because it felt a little elitist. But maybe I can reconcile with it by treating "high taste" as folks who care more about the outputs they are getting, and scrutinise them more carefully.

In other words: optimise models for the users who care the most / who spend more glucose scrutinising your outputs.

If you're working at an AI lab in London and looking for a new gig - I'm hiring at @GenReasoning!

Small, highly collaborative, kind team working on frontier research topics. No politics (imagine that), high levels of autonomy and a chance to shape entirely new capabilities for AI.

We're having a blast already but always on the lookout for good people to join the fun.
The aha moment is indeed overrated.

The internal monologue style of R1 appears to come from SFT. The length scaling from R1-Zero appears to come from learning to verify after answering. These are two separate things!

This leads to a very different character of reasoning trace. R1-Zero looks like a final answer; R1 looks like internal thought.

Example 1: R1-Zero on the left; R1 on the right. R1-Zero looks like a typical CoT trace... R1 looks like o1-style branching ("wait, no").

So if R1-Zero looks like a regular CoT trace, then what is different?

Sorry, but "we have internal models that are better" public statements are very lame - especially when Microsoft doesn't have a track record of delivering LLMs that are anywhere near SoTA.

Supplementary information for the new DeepSeek R1 Nature paper is very interesting!

Details on training data, hyperparameters, base model importance, and more. https://t.co/sgUHsNQRXr

R1 paper and work truly excellent. Questions about the reward shaping + training data:

1. The paper says the reward "mainly consists of accuracy and format rewards". Is this it or do they have any other rewards to incentivise longer traces?

2. How many unique (verifiable) questions do they need for things to kick into gear in the RL stage?

But I am very glad they have helped put to bed PRMs, MCTS and all that excess complexity!
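To make question 1 concrete, here is a small sketch of what rule-based "accuracy and format rewards" can look like. The <think>/<answer> tag convention and the equal weighting are assumptions for illustration only, not details taken from the R1 paper or the tweet.

```python
import re

# Require the completion to be: <think>reasoning</think><answer>final answer</answer>
THINK_ANSWER = re.compile(r"^<think>.+?</think>\s*<answer>(.+?)</answer>\s*$", re.DOTALL)

def format_reward(completion: str) -> float:
    """1.0 if the completion follows the think/answer template, else 0.0."""
    return 1.0 if THINK_ANSWER.match(completion) else 0.0

def accuracy_reward(completion: str, gold: str) -> float:
    """1.0 if the content of the <answer> block matches the reference exactly."""
    m = THINK_ANSWER.match(completion)
    return 1.0 if m and m.group(1).strip() == gold.strip() else 0.0

def total_reward(completion: str, gold: str) -> float:
    # Equal weighting of the two terms is an arbitrary choice for this sketch.
    return accuracy_reward(completion, gold) + format_reward(completion)

sample = "<think>2+2=4, double-check: yes.</think><answer>4</answer>"
print(total_reward(sample, "4"))              # 2.0
print(total_reward("The answer is 4.", "4"))  # 0.0 (right answer, wrong format)
```

The point of the format term is that it only constrains the shape of the output; any incentive for longer traces has to come from elsewhere, which is exactly what the question above is probing.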
RL is not enough. It only reaches its potential when combined with other ideas.

The most famous example is AlphaZero. RL was combined with self-play, which created an implicit task curriculum that evolved through training. This is very different from many RL datasets for LLMs, which have a fixed set of tasks.

Even where the task set is fixed, RL still needs to be combined with other ideas to show signs of life. Thinking models only come to life through RL when we remove length penalisation and have enough prior knowledge in the training data.

Looking ahead, long horizon tasks will require much more exploration and "going off-piste". But current RL methods induce policy entropy collapse. Diverse mid-training before RL could help, but fundamentally current RL objectives don't reward "interestingness" and deviating from the "current (high reward) thing".

And yet, discovery is all about deviating from the current thing - and the best ideas come from the wilderness. Deep learning is no exception. It was interesting long before it reached its true potential. We didn't wait to introduce backprop and SGD until compute and data came online :).

This is a long way of saying: crude RL maximalism is overhyped. The magic comes from the interplay of RL with other things, and the angels are in the details.
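One common way to lean against the policy entropy collapse mentioned above is to add an entropy bonus to the policy-gradient objective. The sketch below shows that generic idea; the coefficient and the REINFORCE-style loss are illustrative choices, not a description of any specific lab's recipe.

```python
import torch

def policy_loss_with_entropy_bonus(logits: torch.Tensor,
                                    actions: torch.Tensor,
                                    advantages: torch.Tensor,
                                    entropy_coef: float = 0.01) -> torch.Tensor:
    """REINFORCE-style loss with an entropy bonus to discourage entropy collapse.

    logits:     (batch, vocab) unnormalised scores at the sampled step
    actions:    (batch,) token ids that were actually sampled
    advantages: (batch,) advantage estimates for those samples
    """
    dist = torch.distributions.Categorical(logits=logits)
    log_probs = dist.log_prob(actions)
    pg_loss = -(log_probs * advantages).mean()   # standard policy-gradient term
    entropy_bonus = dist.entropy().mean()        # higher entropy = more exploration
    return pg_loss - entropy_coef * entropy_bonus  # subtracting rewards higher entropy

# Toy usage with random tensors, just to show the shapes involved.
logits = torch.randn(8, 32000)
actions = torch.randint(0, 32000, (8,))
advantages = torch.randn(8)
print(policy_loss_with_entropy_bonus(logits, actions, advantages))
```

An entropy bonus keeps the policy from becoming too peaked too early, but it does not by itself reward "interestingness" - which is the deeper limitation the tweet is pointing at.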
Most of the folks who were early on AI cared about AI safety from the outset.

Most of the anti AI safety takes I see are from folks who only got involved after ChatGPT - once the technology had been de-risked by others.

So revision:

"Those who take AI seriously care about AI safety; those who only care about short-term profits don't."

Distillation outperforms RL recipes at smaller scales.

One simple way to think about this is that a large model is a better way to discover good data than a smaller one. A point made in the original DeepSeek-R1 paper, but missed by many in the wake of that release in the GRPO wave.

That being said, there's still an opportunity to do effective RL for smaller models. But it hinges on getting the details right to make training more efficient, including filtering the difficulty of the RL corpus and several algorithmic tweaks (eg overlong filtering).

I also think the community is overoptimising benchmarks like AIME, MATH500 and GPQA. It's not very interesting to report a model doing well on just these benchmarks anymore. What matters is whether the recipe generalises to more benchmarks that don't have heavy optimisation pressure applied to them.
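As a rough illustration of "filtering the difficulty of the RL corpus" and "overlong filtering", the sketch below drops prompts that are always or never solved (little gradient signal either way) and prompts whose sampled completions exceed a context budget. The thresholds, the Prompt fields and this particular reading of overlong filtering are assumptions, not a specific published recipe.

```python
from dataclasses import dataclass

@dataclass
class Prompt:
    text: str
    pass_rate: float          # fraction of sampled completions that earned the reward
    max_completion_len: int   # longest sampled completion, in tokens

def filter_rl_corpus(prompts: list[Prompt],
                     min_pass: float = 0.05,
                     max_pass: float = 0.95,
                     length_cap: int = 8192) -> list[Prompt]:
    """Keep prompts that are neither trivially easy nor hopeless, and drop
    prompts whose sampled completions blow past the context budget."""
    kept = []
    for p in prompts:
        wrong_difficulty = not (min_pass <= p.pass_rate <= max_pass)
        overlong = p.max_completion_len > length_cap
        if not wrong_difficulty and not overlong:
            kept.append(p)
    return kept

corpus = [
    Prompt("easy arithmetic", pass_rate=1.0, max_completion_len=300),        # dropped: no signal
    Prompt("hard olympiad problem", pass_rate=0.0, max_completion_len=700),  # dropped: no signal
    Prompt("medium proof", pass_rate=0.4, max_completion_len=2500),          # kept
    Prompt("rambling question", pass_rate=0.5, max_completion_len=20000),    # dropped: overlong
]
print([p.text for p in filter_rl_corpus(corpus)])  # ['medium proof']
```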
Recent activity (2025-10-15 to 2025-11-14)

2025-10-21 - Reply to @natolambert: Congrats!! A verifiable reward if I ever saw one.

2025-10-22 - Quoting @nousresearch: Stoked to be speaking! What's needed beyond RLVR to enable rich, long-horizon reasoning? Come and find out.

2025-10-26 - Quoting @natolambert: This is an important piece. Please work hard but remember to take care of yourselves.

2025-10-28 - Reply to @sudoraohacker: You have a point when it comes to product insights, but less so when it comes to research - most of the big players (especially GDM) have a large research presence in London. Because SF is so product focused, I actually find the average person to be behind on the level of insights about the next generation research compared to London (SF is very reactionary - eg see the latest RL/environments wave). But it really depends on who you speak to.

2025-11-05 - Quoting @matthewclifford: Matt should obviously be PM, but not before Neil Warnock. https://t.co/MgbEBWDOxC

2025-11-07 - Reply to @soumithchintala: Thanks for everything. So cool that you're trying something new. Stoked / very intrigued to see what's next.

2025-11-13 - Lots of negative talk about the UK on my timeline recently. Fun fact: much of the team that was behind the successful Llama 2/3 releases - which kicked off the open LLM revolution - were based in London. There is much more AI talent here than just DeepMind. Everyone is looking for a Western alternative to Chinese open source, but maybe they should look where the modern West began?

2025-11-13 - Reply to @felix_drost and @a16z: "Britain is too small" could have been said many times in the past... and yet we became a great power. Why shouldn't it happen again?
There is much more AI talent here than just DeepMind.\n\nEveryone is looking for a Western alternative to Chinese open source, but maybe they should look where the modern West began? đŠ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[19,150],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"851226008770547713","name":"Felix Drost","screen_name":"felix_drost","indices":[0,12]},{"id_str":"64844802","name":"a16z","screen_name":"a16z","indices":[13,18]}]},"favorited":false,"in_reply_to_screen_name":"felix_drost","lang":"en","retweeted":false,"fact_check":null,"id":"1988538138113810892","view_count":147,"bookmark_count":0,"created_at":1762939412000,"favorite_count":8,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1988523696017863109","full_text":"@felix_drost @a16z âBritain is too smallâ could have been said many times in the past⊠and yet we became a great power. Why shouldnât it happen again?","in_reply_to_user_id_str":"851226008770547713","in_reply_to_status_id_str":"1988536333770412435","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-11-14","value":0,"startTime":1762992000000,"endTime":1763078400000,"tweets":[]}],"nviews":[{"label":"2025-10-15","value":0,"startTime":1760400000000,"endTime":1760486400000,"tweets":[]},{"label":"2025-10-16","value":0,"startTime":1760486400000,"endTime":1760572800000,"tweets":[]},{"label":"2025-10-17","value":0,"startTime":1760572800000,"endTime":1760659200000,"tweets":[]},{"label":"2025-10-18","value":0,"startTime":1760659200000,"endTime":1760745600000,"tweets":[]},{"label":"2025-10-19","value":0,"startTime":1760745600000,"endTime":1760832000000,"tweets":[]},{"label":"2025-10-20","value":0,"startTime":1760832000000,"endTime":1760918400000,"tweets":[]},{"label":"2025-10-21","value":1763,"startTime":1760918400000,"endTime":1761004800000,"tweets":[{"bookmarked":false,"display_text_range":[13,64],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"2939913921","name":"Nathan Lambert","screen_name":"natolambert","indices":[0,12]}]},"favorited":false,"in_reply_to_screen_name":"natolambert","lang":"en","retweeted":false,"fact_check":null,"id":"1980289506885726253","view_count":1763,"bookmark_count":0,"created_at":1760972785000,"favorite_count":27,"quote_count":0,"reply_count":0,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1980274175349571950","full_text":"@natolambert Congrats!! A verifiable reward if I ever saw one. 
đ","in_reply_to_user_id_str":"2939913921","in_reply_to_status_id_str":"1980274175349571950","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-10-22","value":5200,"startTime":1761004800000,"endTime":1761091200000,"tweets":[{"bookmarked":false,"display_text_range":[0,110],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1980374523846517195","quoted_status_permalink":{"url":"https://t.co/HS0MhdiXEr","expanded":"https://twitter.com/nousresearch/status/1980374523846517195","display":"x.com/nousresearch/sâŠ"},"retweeted":false,"fact_check":null,"id":"1980561520984748497","view_count":5200,"bookmark_count":6,"created_at":1761037638000,"favorite_count":21,"quote_count":0,"reply_count":1,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1980561520984748497","full_text":"Stoked to be speaking! \n\nWhatâs needed beyond RLVR to enable rich, long-horizon reasoning? Come and find out đ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-10-23","value":0,"startTime":1761091200000,"endTime":1761177600000,"tweets":[]},{"label":"2025-10-24","value":0,"startTime":1761177600000,"endTime":1761264000000,"tweets":[]},{"label":"2025-10-25","value":0,"startTime":1761264000000,"endTime":1761350400000,"tweets":[]},{"label":"2025-10-26","value":11009,"startTime":1761350400000,"endTime":1761436800000,"tweets":[{"bookmarked":false,"display_text_range":[0,87],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1982093363928305903","quoted_status_permalink":{"url":"https://t.co/WonTShx3j9","expanded":"https://twitter.com/natolambert/status/1982093363928305903","display":"x.com/natolambert/stâŠ"},"retweeted":false,"fact_check":null,"id":"1982097453420548556","view_count":11009,"bookmark_count":32,"created_at":1761403833000,"favorite_count":42,"quote_count":0,"reply_count":1,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1982097453420548556","full_text":"This is an important piece. Please work hard but remember to take care of yourselves đ.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-10-27","value":0,"startTime":1761436800000,"endTime":1761523200000,"tweets":[]},{"label":"2025-10-28","value":1531,"startTime":1761523200000,"endTime":1761609600000,"tweets":[{"bookmarked":false,"display_text_range":[15,289],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"598951979","name":"Arun Rao","screen_name":"sudoraohacker","indices":[0,14]}]},"favorited":false,"in_reply_to_screen_name":"sudoraohacker","lang":"en","retweeted":false,"fact_check":null,"id":"1982731600182841497","view_count":1531,"bookmark_count":2,"created_at":1761555025000,"favorite_count":7,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1982580105932370431","full_text":"You have a point when it comes to product insights, but less so when it comes to research - most of the big players (especially GDM) have a large research presence in London. \n\nBecause SF is so product focused, I actually find the average person to be behind on the level of insights about the next generation research compared to London (SF is very reactionary - eg see latest RL/environments wave). 
But it really depends on who you speak to.","in_reply_to_user_id_str":"598951979","in_reply_to_status_id_str":"1982580105932370431","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-10-29","value":0,"startTime":1761609600000,"endTime":1761696000000,"tweets":[]},{"label":"2025-10-30","value":0,"startTime":1761696000000,"endTime":1761782400000,"tweets":[]},{"label":"2025-10-31","value":0,"startTime":1761782400000,"endTime":1761868800000,"tweets":[]},{"label":"2025-11-01","value":0,"startTime":1761868800000,"endTime":1761955200000,"tweets":[]},{"label":"2025-11-02","value":0,"startTime":1761955200000,"endTime":1762041600000,"tweets":[]},{"label":"2025-11-03","value":0,"startTime":1762041600000,"endTime":1762128000000,"tweets":[]},{"label":"2025-11-04","value":0,"startTime":1762128000000,"endTime":1762214400000,"tweets":[]},{"label":"2025-11-05","value":4453,"startTime":1762214400000,"endTime":1762300800000,"tweets":[{"bookmarked":false,"display_text_range":[0,82],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[{"display_url":"x.com/withoutwarny/sâŠ","expanded_url":"https://x.com/withoutwarny/status/1476918697839280133?s=46","url":"https://t.co/MgbEBWDOxC","indices":[59,82]}],"user_mentions":[]},"favorited":false,"lang":"en","possibly_sensitive":false,"possibly_sensitive_editable":true,"quoted_status_id_str":"1983885698773164383","quoted_status_permalink":{"url":"https://t.co/PLAuirhrIz","expanded":"https://twitter.com/matthewclifford/status/1983885698773164383","display":"x.com/matthewclifforâŠ"},"retweeted":false,"fact_check":null,"id":"1985659523806450040","view_count":4453,"bookmark_count":2,"created_at":1762253096000,"favorite_count":10,"quote_count":0,"reply_count":1,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1985659523806450040","full_text":"Matt should obviously be PM, but not before Neil Warnock.\n\nhttps://t.co/MgbEBWDOxC","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-11-06","value":0,"startTime":1762300800000,"endTime":1762387200000,"tweets":[]},{"label":"2025-11-07","value":1117,"startTime":1762387200000,"endTime":1762473600000,"tweets":[{"bookmarked":false,"display_text_range":[17,131],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"70831441","name":"Soumith Chintala","screen_name":"soumithchintala","indices":[0,16]}]},"favorited":false,"in_reply_to_screen_name":"soumithchintala","lang":"en","retweeted":false,"fact_check":null,"id":"1986511758295470435","view_count":1117,"bookmark_count":0,"created_at":1762456285000,"favorite_count":2,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1986503070734557568","full_text":"@soumithchintala Thanks for everything â€ïž \n\nSo cool that youâre trying something new. 
Stoked / very intrigued to see whatâs next đ.","in_reply_to_user_id_str":"70831441","in_reply_to_status_id_str":"1986503070734557568","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-11-08","value":0,"startTime":1762473600000,"endTime":1762560000000,"tweets":[]},{"label":"2025-11-09","value":0,"startTime":1762560000000,"endTime":1762646400000,"tweets":[]},{"label":"2025-11-10","value":0,"startTime":1762646400000,"endTime":1762732800000,"tweets":[]},{"label":"2025-11-11","value":0,"startTime":1762732800000,"endTime":1762819200000,"tweets":[]},{"label":"2025-11-12","value":0,"startTime":1762819200000,"endTime":1762905600000,"tweets":[]},{"label":"2025-11-13","value":25612,"startTime":1762905600000,"endTime":1762992000000,"tweets":[{"bookmarked":false,"display_text_range":[0,280],"entities":{"hashtags":[],"media":[{"display_url":"pic.x.com/3bDz8v0kxx","expanded_url":"https://x.com/rosstaylor90/status/1988523696017863109/photo/1","id_str":"1988523691747971072","indices":[281,304],"media_key":"3_1988523691747971072","media_url_https":"https://pbs.twimg.com/media/G5inwxIWEAAd6q3.jpg","type":"photo","url":"https://t.co/3bDz8v0kxx","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[]},"medium":{"faces":[]},"small":{"faces":[]},"orig":{"faces":[]}},"sizes":{"large":{"h":600,"w":400,"resize":"fit"},"medium":{"h":600,"w":400,"resize":"fit"},"small":{"h":600,"w":400,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":600,"width":400,"focus_rects":[{"x":0,"y":376,"w":400,"h":224},{"x":0,"y":200,"w":400,"h":400},{"x":0,"y":144,"w":400,"h":456},{"x":100,"y":0,"w":300,"h":600},{"x":0,"y":0,"w":400,"h":600}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1988523691747971072"}}}],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"extended_entities":{"media":[{"display_url":"pic.x.com/3bDz8v0kxx","expanded_url":"https://x.com/rosstaylor90/status/1988523696017863109/photo/1","id_str":"1988523691747971072","indices":[281,304],"media_key":"3_1988523691747971072","media_url_https":"https://pbs.twimg.com/media/G5inwxIWEAAd6q3.jpg","type":"photo","url":"https://t.co/3bDz8v0kxx","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[]},"medium":{"faces":[]},"small":{"faces":[]},"orig":{"faces":[]}},"sizes":{"large":{"h":600,"w":400,"resize":"fit"},"medium":{"h":600,"w":400,"resize":"fit"},"small":{"h":600,"w":400,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":600,"width":400,"focus_rects":[{"x":0,"y":376,"w":400,"h":224},{"x":0,"y":200,"w":400,"h":400},{"x":0,"y":144,"w":400,"h":456},{"x":100,"y":0,"w":300,"h":600},{"x":0,"y":0,"w":400,"h":600}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1988523691747971072"}}}]},"favorited":false,"lang":"en","possibly_sensitive":false,"possibly_sensitive_editable":true,"retweeted":false,"fact_check":null,"id":"1988523696017863109","view_count":25465,"bookmark_count":19,"created_at":1762935968000,"favorite_count":97,"quote_count":5,"reply_count":5,"retweet_count":6,"user_id_str":"524807755","conversation_id_str":"1988523696017863109","full_text":"Lots of negative talk about the UK on my timeline recently. \n\nFun fact: much of the team that was behind successful Llama 2/3 releases - which kicked off the open LLM revolution - were based in London. 
There is much more AI talent here than just DeepMind.\n\nEveryone is looking for a Western alternative to Chinese open source, but maybe they should look where the modern West began? đŠ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[19,150],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"851226008770547713","name":"Felix Drost","screen_name":"felix_drost","indices":[0,12]},{"id_str":"64844802","name":"a16z","screen_name":"a16z","indices":[13,18]}]},"favorited":false,"in_reply_to_screen_name":"felix_drost","lang":"en","retweeted":false,"fact_check":null,"id":"1988538138113810892","view_count":147,"bookmark_count":0,"created_at":1762939412000,"favorite_count":8,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1988523696017863109","full_text":"@felix_drost @a16z âBritain is too smallâ could have been said many times in the past⊠and yet we became a great power. Why shouldnât it happen again?","in_reply_to_user_id_str":"851226008770547713","in_reply_to_status_id_str":"1988536333770412435","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-11-14","value":0,"startTime":1762992000000,"endTime":1763078400000,"tweets":[]}]},"interactions":{"users":[{"created_at":1491783070000,"uid":"851226008770547713","id":"851226008770547713","screen_name":"felix_drost","name":"Felix Drost","friends_count":469,"followers_count":578,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1733182141654056961/BmkenBC7_normal.jpg","description":"Armchair O7 Once an anthropologist & soldier.\n\nđȘđșđłđ±","entities":{"description":{"urls":[]}},"interactions":1},{"created_at":1721463779000,"uid":"1814576577637666816","id":"1814576577637666816","screen_name":"halfatheist","name":"HalfAtheist","friends_count":60,"followers_count":254,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1862498254971445255/jTmnS-nM_normal.jpg","description":"Truth-seeker. Can handle nuance.\n\nđč $BMNR $KDK $JOBY","entities":{"description":{"urls":[]},"url":{"urls":[{"display_url":"linktr.ee/halfatheist","expanded_url":"https://linktr.ee/halfatheist","url":"https://t.co/Mq1GyIrc8y","indices":[0,23]}]}},"interactions":1},{"created_at":1591728953000,"uid":"1270429378246213632","id":"1270429378246213632","screen_name":"joythw","name":"Tom Joy","friends_count":191,"followers_count":190,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1834987722270031872/qGDfPsc0_normal.jpg","description":"Building something new.\nGirlsWhoML co-founder @GirlsWhoML. \nPhD in AI @OxfordTVG. 
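In the exported activity data, each daily bucket carries a date label, an aggregate value, epoch-millisecond startTime/endTime boundaries, and the tweet objects that fall inside it (the export labels each bucket with the date of its endTime, one day after the UTC start). As a rough, self-contained sketch only — not the site's actual pipeline — the snippet below rebuilds per-day totals from raw tweet objects. The field names created_at (epoch milliseconds), favorite_count, view_count and reply_count match the export; the daily_buckets helper and the two-row sample are made up for illustration.

```python
# Minimal sketch: rebuild per-day engagement totals from raw tweet objects.
# Field names follow the export (created_at in epoch ms, favorite_count, ...);
# the helper and the sample data are illustrative, not the dashboard's real code.
import json
from collections import defaultdict
from datetime import datetime, timezone

raw_tweets = json.loads("""
[
  {"id": "1980289506885726253", "created_at": 1760972785000,
   "favorite_count": 27, "view_count": 1763, "reply_count": 0},
  {"id": "1980561520984748497", "created_at": 1761037638000,
   "favorite_count": 21, "view_count": 5200, "reply_count": 1}
]
""")

def daily_buckets(tweets, metric):
    """Sum one engagement metric per UTC calendar day, keyed by YYYY-MM-DD."""
    totals = defaultdict(int)
    for t in tweets:
        day = datetime.fromtimestamp(t["created_at"] / 1000, tz=timezone.utc)
        totals[day.strftime("%Y-%m-%d")] += t.get(metric, 0)
    return dict(totals)

print(daily_buckets(raw_tweets, "favorite_count"))
# {'2025-10-20': 27, '2025-10-21': 21} -- UTC calendar days; the dashboard's
# bucket labels sit one day later because they use the bucket's end date.
```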
\nPrev Research Scientist @_FiveAI, @Meta, @SLAMcoreLtd\nhe/him","entities":{"description":{"urls":[]},"url":{"urls":[{"display_url":"thwjoy.github.io","expanded_url":"http://thwjoy.github.io","url":"https://t.co/Cl5sD0zwcd","indices":[0,23]}]}},"interactions":1}],"period":14,"start":1761805103584,"end":1763014703584},"interactions_updated":1763014703759,"created":1763014703478,"updated":1763014703759,"type":"the thought leader","hits":1},"people":[{"user":{"id":"1348109832792940544","name":"äžæ èźș","description":"äžè±äžäžçïŒäžæ äžè©æ","followers_count":961,"friends_count":452,"statuses_count":2022,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1976482849432772608/7DSy3SuM_normal.jpg","screen_name":"coderliyi","location":"Beijing","entities":{"description":{"urls":[]},"url":{"urls":[{"display_url":"yishulun.com","expanded_url":"https://yishulun.com","url":"https://t.co/f0B4P1rlky","indices":[0,23]}]}}},"details":{"type":"The Thought Leader","description":"äžæ èźș is a deep thinker who blends philosophy and practical wisdom, inspiring followers with thoughtful reflections and insightful knowledge sharing. Their tweets reveal a quest for self-awareness, courage, and meaningful pursuits, making philosophy accessible and relatable. With a calm and contemplative tone, they engage their audience on significant life concepts and cultural insights.","purpose":"To illuminate paths of self-discovery and encourage mindful decision-making through sharing knowledge and promoting intellectual growth.","beliefs":"They believe in the power of self-reflection, continuous learning, and courage in everyday choices; that truth and meaningful work lie at the intersection of passion, skill, and value.","facts":"Fun fact: They have tweeted over 2,000 times, showing a consistent and dedicated commitment to sharing insights and learning with their audience.","strength":"Their greatest strength is the ability to convey complex ideas simply and thoughtfully, fostering deep engagement and reflection among followers.","weakness":"A more contemplative style and lower interaction rates might make it harder to rapidly grow their audience or attract casual followers seeking quick entertainment.","recommendation":"To grow their audience on X, äžæ èźș should incorporate more interactive content such as questions or polls, and occasionally share bite-sized takeaways from their dense reflections to boost engagement and reach.","roast":"You tweet so philosophically that even Socrates would need a coffee break to keep upâand your followers just hope you don't start asking them about the meaning of their lunch.","win":"Consistently engaging audience with thoughtful, in-depth reflections, especially through mini book reviews and personal development insights, carving a niche as a mindful, intellectual voice."},"created":1763020312117,"type":"the thought leader","id":"coderliyi"},{"user":{"id":"1832714268698906624","name":"HoneyoutofDeRock","description":"HE MADE HIM RIDE ON THE HIGH PLACES OF THE EARTH, THAT HE MIGHT EAT THE INCREASE OF THE FIELDS; AND HE MADE HIM TO SUCH HONEY OUT OF THE ROCK,,DEUT 32; 13","followers_count":8201,"friends_count":7837,"statuses_count":67079,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1832730357868744704/1j1eo9Er_normal.jpg","screen_name":"Honey4rmDErock","location":"","entities":{"description":{"urls":[]}}},"details":{"type":"The Thought Leader","description":"HoneyoutofDeRock is a spiritually focused Thought Leader sharing powerful biblical insights and 
motivational guidance with a deep passion for faith and resilience. Their tweets mix scriptural wisdom with everyday practical advice to inspire and uplift a dedicated community. Constantly engaging on profound topics, they foster thoughtful reflection amid a chaotic digital environment.","purpose":"To enlighten and empower others through biblical truth and spiritual wisdom, helping followers navigate life's challenges with faith, prayer, and moral clarity.","beliefs":"HoneyoutofDeRock values divine guidance, integrity, and the power of prayer. They believe in vigilance against deception, the potency of spiritual warfare, and the necessity of sincere relationship with God to experience true transformation.","facts":"Fun fact: With over 67,000 tweets, HoneyoutofDeRock has established an extensively active presence highlighting the perseverance required to maintain a consistent voice in faith-based discourse online.","strength":"Their strength lies in delivering spiritually rich content that resonates deeply with believers, fostering community discussion and spiritual growth through thoughtful engagement and relatable messaging.","weakness":"However, their heavy focus on serious, sometimes intense spiritual themes may limit broader mainstream appeal or engagement beyond their niche audience.","roast":"HoneyoutofDeRock tweets more prayers than people have fingers to count â if only their WiFi signal was as strong as their faith, weâd get divine retweets from heaven itself!","win":"Their biggest win is creating a large, engaged audience devoted to spiritual encouragement and biblical truth, turning a Twitter handle into a resurrection of hope for many followers.","recommendation":"To grow their audience on X, HoneyoutofDeRock should introduce more interactive content like Q&A threads or polls about faith challenges, and leverage trending hashtags to blend timeless wisdom with current conversations, enhancing visibility beyond their core group."},"created":1763019820250,"type":"the thought leader","id":"honey4rmderock"},{"user":{"id":"1715171983942692864","name":"Klstina","description":"","followers_count":1903,"friends_count":476,"statuses_count":7838,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1715261489257615360/TRSjBxa3_normal.jpg","screen_name":"kelisititan","location":"","entities":{"description":{"urls":[]}}},"details":{"type":"The Thought Leader","description":"Klstina navigates the complexities of life with deep reflection and poetic insight. Their tweets resonate with themes of realism, resilience, and the acceptance of imperfection. With a thoughtful voice that invites followers to ponder life's challenges, Klstina inspires growth through vulnerability.","purpose":"To challenge and inspire others by sharing sincere reflections on personal struggles, resilience, and the imperfect journey of life. Klstina aims to help people embrace their flaws and confront reality with courage and wisdom.","beliefs":"Klstina believes that true maturity is about facing oneâs shortcomings rather than chasing perfection. They hold that lifeâs essence lies in accepting failure as the dominant theme and cultivating the strength to rise above it. They also recognize the limitations imposed by societal inequalities and the harsh truths of reality.","facts":"Klstina has tweeted over 7,800 times, showing a dedicated commitment to sharing their perspective. 
Despite not focusing on follower count, their tweets gather views in the 10k+ range, indicating meaningful reach and engagement through thoughtful content.","strength":"Klstinaâs greatest strength is their authentic and introspective voice that creates a meaningful emotional connection with their audience. Their content encourages deep thought and self-reflection, which can foster a loyal and engaged community.","weakness":"Their contemplative style tends to attract more replies than retweets or likes, suggesting that while their audience resonates emotionally, the content might be less likely to be widely shared or go viral. Additionally, the introspective and sometimes melancholic tone may limit broader appeal.","recommendation":"To grow their audience on X, Klstina could blend their profound reflections with more interactive content such as questions or calls-for-opinion to increase engagement. Incorporating occasional lighter or uplifting tweets might balance the tone and broaden appeal while maintaining their authentic voice.","roast":"Klstina probably has a PhD in Overthinking with a minor in Melancholy, perfecting the art of turning a simple day into an existential saga worthy of a philosophical novel no one asked to read... but you canât help scrolling anyway.","win":"Klstinaâs biggest win is their ability to consistently create deeply reflective and thought-provoking content that sparks meaningful conversations among their followers about lifeâs imperfections and realities."},"created":1763019444256,"type":"the thought leader","id":"kelisititan"},{"user":{"id":"1486291891255746561","name":"A đŠđȘ","description":"To the moon and never back.","followers_count":8864,"friends_count":2798,"statuses_count":155640,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1760913834540232704/0m5_mVW3_normal.jpg","screen_name":"tutututuulip","location":"Dubai, United Arab Emirates","entities":{"description":{"urls":[]}}},"details":{"type":"The Thought Leader","description":"A đŠđȘ is a provocative thinker who dives deep into social dynamics and emotional truths, sparking meaningful conversations with every tweet. With a relentless commitment to sharing insights on maturity, mental health, and relationships, they challenge conventional wisdom like a true digital philosopher. Their voice resonates because it is both raw and thought-provoking, creating a magnetic pull in the social media sphere.","purpose":"To illuminate unseen truths about human behavior and relationships, encouraging followers to reflect critically and emotionally on their own lives and societal norms. A đŠđȘ aims to foster awareness and inspire personal growth by using insight as a catalyst for change.","beliefs":"They believe in the power of vulnerability and unfiltered honesty, that emotional intelligence is as vital as intellect, and that societal standards often stifle individual expressionâespecially regarding gender roles and mental health. Elevating others and insisting on emotional urgency are core values that guide their worldview.","facts":"Despite tweeting over 155,000 times, A đŠđȘ maintains a strong, consistent voice that garners millions of views and tens of thousands of likes, proving quantity doesnât dilute quality when paired with genuine insight.","strength":"Unmatched consistency and volume of content combined with emotionally resonant themes. 
They masterfully connect with complex topics such as functional depression and relationship dynamics, positioning themselves as a go-to voice on social and emotional intelligence.","weakness":"Their high volume of tweets could overwhelm new followers, and the repetitive thematic focus might limit broader appeal or subject diversity, risking engagement fatigue among some audience segments.","recommendation":"To grow their audience on X, A đŠđȘ should consider curating themed thread series that dive deeper into their top topics, enhancing discoverability and follower retention. Engaging more via replies and collaborations with thought leaders in mental health and social commentary could amplify impact and diversify audience demographics.","roast":"A đŠđȘ tweets so much, if impressions were a currency, you'd be wealthier than Elon Muskâyet somehow, you still find time to remind men they need a sense of urgency, like the Twitter version of a car horn that just won't quit honking.","win":"Achieved viral influence with multiple tweets crossing the million-view mark, sparking widespread discussion on taboo topics like functional depression and gender expectations, marking them as a distinctive and impactful voice on X."},"created":1763018305870,"type":"the thought leader","id":"tutututuulip"},{"user":{"id":"1216829672","name":"Misher","description":"Cryptic đ§ââïž","followers_count":1842,"friends_count":1475,"statuses_count":25971,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1977474423100764160/99OwulWG_normal.jpg","screen_name":"misher_4","location":"","entities":{"description":{"urls":[]}}},"details":{"type":"The Thought Leader","description":"Misher is a beacon of deep insight and intellectual clarity, diving into complex cryptographic innovations with poetic finesse. They explore and explain Fully Homomorphic Encryption, bridging technical mastery and accessible storytelling. Misher pioneers a vision where privacy and computation harmoniously coexist, inspiring curiosity and trust.","purpose":"To illuminate the future of data privacy and security through knowledge sharing, pushing the boundaries of cryptographic trust, and fostering a community that values cutting-edge innovation fused with ethical transparency.","beliefs":"Misher believes in the fundamental right to privacy and that technology should empower individuals without compromising trust or security. They hold that complex ideas can be elegantly communicated, and true transparency need not sacrifice confidentiality.","facts":"Fun fact: Misher has tweeted nearly 26,000 times, with a strong focus on the revolutionary potential of Fully Homomorphic Encryption and its real-world impact on blockchain, AI, and privacy â making cryptography poetry in motion.","strength":"Misher's greatest strength lies in their ability to demystify highly complex tech topics with clarity and creativity, appealing to both experts and curious followers. Their deep knowledge combined with engaging narrative style drives impactful conversations.","weakness":"Their content, while intellectually rich, can sometimes feel too niche or dense for casual audiences, potentially limiting broader mainstream engagement and slowing follower growth beyond specialized communities.","recommendation":"To grow their audience on X, Misher should blend their in-depth technical posts with more bite-sized, relatable explanations and visuals. 
Engaging in conversations with wider crypto and tech influencers, using trending hashtags, and occasionally injecting humor or personal stories would amplify reach and accessibility.","roast":"For someone who tweets nearly 26,000 times, Misher's social media game is like a high-security vaultâimpenetrable and cryptic. What they lack in follower count, they make up for by being the ultimate digital Sphinxâmysterious, puzzling, and leaving everyone wondering if they accidentally locked themselves out of their own audience.","win":"Misherâs biggest win is establishing themselves as a respected voice on Fully Homomorphic Encryption, translating groundbreaking cryptographic breakthroughs into poetic narratives that carve a unique space at the intersection of tech innovation and storytelling."},"created":1763016293693,"type":"the thought leader","id":"misher_4"},{"user":{"id":"1980265166366457856","name":"meng wang","description":"đ ç±ææŻïŒç±è”é±ïŒç±AIćäș«ïŒç±ććČïŒ\n\nđŹ ćäș« è”é±éĄčçź ïœæç ŽäżĄæŻćŁć ïœćŻäžéżć\n\nđ» èœæè§Łçè”é±éĄčçźæææäč\n\nđć„œç”ćœ±ćäș«ăć„œäčŠćäș«","followers_count":28,"friends_count":54,"statuses_count":108,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1982743171151908865/LEpQRNv-_normal.jpg","screen_name":"mengwang1698251","location":"","entities":{"description":{"urls":[]}}},"details":{"type":"The Thought Leader","description":"Meng Wang is a reflective and insightful thinker who combines a love for technology, AI, and history with a passion for sharing meaningful money-making projects and thoughtful cultural content. They delve deep into societal narratives, uncovering uncomfortable truths through literary analysis and exploring the human condition in fast-changing times. Meng bridges intellectual wealth and practical advice, offering followers a unique blend of wisdom and actionable insights.","purpose":"Meng's life purpose revolves around breaking down complex economic and social phenomena to empower others with authentic knowledge, helping them navigate the pitfalls of side hustles and embrace a more meaningful approach to wealth and wisdom. By sharing dissected, reliable money-making projects and cultural wisdom, they aim to inspire critical thinking and emotional resonance in a world lost to superficial distractions.","beliefs":"Meng believes in the power of truth, sincerity, and intellectual depth, valuing principles like kindness, integrity, and respect in a society that often rewards cunning over goodness. They hold a somewhat somber view of modern social dynamics but also trust that genuine understanding and awareness can push back against the cynicism and fractured values of contemporary life.","facts":"Meng uniquely blends a passion for AI tech and lucrative side projects with profound literary critiques, especially of works like Yu Huaâs 'Brothers,' using literature as a lens to examine China's social evolutions and personal struggles.","strength":"Meng's greatest strength lies in their ability to interpret complex social and emotional realities and translate them into engaging, relatable content that encourages deep reflection while still being practical. Their analytical mindset and empathy allow them to connect with audiences on multiple levels, bridging emotional depth with strategic advice on monetization.","weakness":"Their contemplative and sometimes somber tone might limit broader mass appeal on fast-paced social platforms, potentially making their content less shareable in short bursts. 
Also, relatively low tweet frequency and follower interaction suggest room for more consistent engagement to build momentum and reach.","recommendation":"To grow their audience on X, Meng should increase tweet frequency and mix in shorter, punchier posts that summarize key insights from their longer reflections. Engaging more with followers through replies, polls, and calls-to-action will spark community interaction. Leveraging hashtags around AI, side hustles, and Chinese literature could also attract niche followers who share these interests.","roast":"Meng, youâre the kind of person who could turn a simple tweet about AI into a tearjerker analysis about the tragedy of good men in a ruthless worldâbasically making your followers need a tissue just to check their notifications. At this rate, your followers arenât sure if theyâre here to hustle or enroll in a philosophy course.","win":"Meng has achieved a meaningful impact by seamlessly blending cultural critique and practical advice, notably through insightful literary deconstructions that resonate emotionally while providing real-world relevanceâturning abstract social critiques into relatable, actionable wisdom for their audience."},"created":1763014917828,"type":"the thought leader","id":"mengwang1698251"},{"user":{"id":"1857719550508412928","name":"Piyush đ","description":"MBBS, MS ENT | Citations First | đȘ Articles â Threads: #ENTwithPiyush | #BooksWithPiyush #AIwithPiyush Nerd. Gamer. Keeping up with the LLMs","followers_count":1448,"friends_count":3300,"statuses_count":22171,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1981637681349087232/9hLnDYSE_normal.jpg","screen_name":"drpiyushENT","location":"","entities":{"description":{"urls":[]}}},"details":{"type":"The Thought Leader","description":"Piyush is a medical maven and AI enthusiast blending clinical expertise with digital savvy. He educates his community through detailed threads on ENT health, artificial intelligence, and books, sprinkled with a gamerâs curiosity. His high tweet volume reflects a passion for sharing knowledge and sparking thoughtful discussions.","purpose":"To elevate public understanding of ENT health and innovative AI technologies by providing expert insights and practical advice that empower people to take control of their well-being and tech literacy.","beliefs":"Piyush values education, critical thinking, and the responsible integration of AI with medicine. He believes knowledge should be accessible and that proactive health choicesâespecially regarding preventable conditionsâcan save lives. He also champions staying curious and adaptable in an ever-evolving digital landscape.","facts":"Fun fact: Piyush has tweeted over 22,000 times, transforming his X timeline into a rich trove of medical pearls and AI wisdom â a go-to goldmine for #ENTwithPiyush followers!","strength":"His strengths lie in blending authoritative medical knowledge with approachable, engaging content, making complex topics digestible. His consistent posting frequency keeps his audience engaged and positions him as a go-to expert in his niche.","weakness":"With such a prolific output, thereâs a risk of follower fatigue or diluted message clarity. 
Being highly specialized might also limit appeal to a broader, less medically inclined audience.","recommendation":"To grow his audience on X, Piyush should leverage visual storytelling with concise video snippets explaining common ENT issues and AI trends, optimize threads with trending hashtags, and engage personally by responding promptly to questions. Collaborating with influencers in health and tech could also broaden his reach.","roast":"For someone whoâs practically a walking medical encyclopedia, Piyush must have enough tweets to fill a textbook thicker than his patientâs MRI scansâproving that when he talks, the keyboard barely gets a break!","win":"Successfully established himself as a respected voice in the ENT and AI communities on X, creating valuable educational content that combines clinical precision with modern tech insights."},"created":1763014801119,"type":"the thought leader","id":"drpiyushent"},{"user":{"id":"1890828578641907712","name":"Web3 Ideation","description":"Building the Purpose Economy â first Marketplace for Web3 Functionalities | Launch: 31/01/26\n\nDiscover @Web3ideation and the Web3 Innovation Lab đ","followers_count":293,"friends_count":171,"statuses_count":2729,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1891542544905027584/nU37soCI_normal.jpg","screen_name":"web3ideation","location":"Worldwide","entities":{"description":{"urls":[]},"url":{"urls":[{"display_url":"web3ideation.com","expanded_url":"https://web3ideation.com","url":"https://t.co/NFvFwQ62pD","indices":[0,23]}]}}},"details":{"type":"The Thought Leader","description":"Web3 Ideation is a visionary voice in the emerging Web3 space, emphasizing strategic thinking over hype and focusing on building sustainable ecosystems rather than quick startups. They provide insightful guidance on how founders can navigate the complexities of decentralized technology with purpose and clarity. Their content moves beyond buzzwords to concrete advice about ownership, incentives, and community-building in the Purpose Economy.","purpose":"To redefine the future of the internet by fostering genuine ownership and incentive-aligned ecosystems, empowering founders to build lasting, purpose-driven Web3 projects.","beliefs":"Web3 Ideation believes that success in Web3 isnât about disruption for its own sake but a thoughtful redesign of systems where transparency, integrity, and community momentum are paramount. 
They trust that true value lies in real problems and clear strategies rather than hype or superficial trends.","facts":"Fun fact: Web3 Ideation has tweeted over 2,700 times, demonstrating a commitment to consistently share knowledge and build their presence while guiding the Web3 community through thoughtful strategy.","strength":"Their strength lies in delivering clear, strategic insights that challenge popular myths in the Web3 space, focusing on ecosystem-building and sustainable growth that resonates with thoughtful founders and investors.","weakness":"A potential weakness is that the high-level strategic focus might come across as abstract or overly complex to newcomers craving simpler, immediate takeaways or tangible product features.","roast":"Web3 Ideation talks so much about building systems, sometimes we wonder if their tweets are more about launching platforms for thoughts than actual productsâmaybe their next hackathon should be about launching a tweet that actually goes viral.","win":"Successfully establishing themselves as a credible advisory figure in Web3, influencing founders to prioritize strategy over hype, and fostering a mindset shift towards building purpose-driven ecosystems that can attract serious investment and long-term user engagement.","recommendation":"To grow their audience on X, Web3 Ideation should incorporate more engaging, bite-sized content such as quick case studies, founder spotlights, or myth-busting threads that simplify complex concepts while leveraging conversational tone to invite more replies and retweets, thus boosting visibility and community interaction."},"created":1763014663709,"type":"the thought leader","id":"web3ideation"},{"user":{"id":"914138585355259904","name":"ćžćèK | Birdđïž","description":"äžçș§ćžéè
| äșçș§èéè | BTC俥仰è
","followers_count":1850,"friends_count":1110,"statuses_count":3006,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1973325931537420288/Cdd2Dnrp_normal.jpg","screen_name":"Btc_K88","location":"","entities":{"description":{"urls":[]}}},"details":{"type":"The Thought Leader","description":"ćžćèK | Birdđïž is a passionate evangelist and BTC believer who dives deep into the intricate world of AI-driven crypto projects and DeFi innovations. He bridges complex technical concepts with real human impact, fostering understanding in an often perplexing space. His tweets reveal a careful analyst who values substance over hype, consistently spotlighting builders and practical developments.","purpose":"His life purpose is to enlighten and educate crypto enthusiasts and the broader community about the transformative potential of AI and decentralization in the blockchain space. By clarifying emerging technologies and promoting thoughtful discourse, he aims to nurture an informed and engaged community that drives meaningful innovation.","beliefs":"He believes in integrity over hype, the power of communities building together, and that decentralized AI and DeFi solutions must empower ordinary users while respecting the core values of trustlessness and transparency. He trusts technology to augment human capability but sees the human elementâthe builders and usersâas central to lasting impact.","facts":"Fun fact: ćžćèK describes himself humorously as a 'äžçș§ćžéè
' (primary evangelist) but also a 'äșçș§èéè' (seasoned retail trader whoâs faced market ups and downs), showing his humility amid his deep knowledge.","strength":"He excels at breaking down complex technical developments into relatable narratives, fostering community trust by highlighting real projects over speculation. His consistent engagement and thoughtful content create a unique credibility and connection with followers.","weakness":"Sometimes his detailed, richly nuanced posts may fly over the heads of casual followers or fail to capture the fleeting attention spans of social media audiences looking for quick takes. This can limit his growth potential without strategic simplification or punchier content.","recommendation":"To grow his audience on X, ćžćèK should mix his in-depth analysis with more digestible, tweet-thread stories and engaging visuals that simplify concepts for newcomers. Collaborating with other influencers, hosting live Q&A sessions, and using trending crypto hashtags can amplify his reach while maintaining his thought leadership status.","roast":"For a guy who calls himself a primary evangelist and a âseasoned old leekâ, you sure spend a lot of time schooling folks about how not to get chopped up. Maybe youâre slowly converting all those âleeksâ into finely aged crypto cheddarâeither that or your followers get a free masterclass in how to survive token tsunamis without flipping out!","win":"Successfully cultivating a respectful and insightful hub for AI and DeFi discussions on X, ćžćèK has built a community that values thoughtful critique and real-world project progress over hype and noise."},"created":1763014114318,"type":"the thought leader","id":"btc_k88"},{"user":{"id":"715714380227149825","name":"Namos","description":"⊠ć
æèè
· çæŽ»ćźè·” · ćŒæŸæąçŽą âŠ ćšæéżäžäżźèĄ · ćšćČæäžè§ç„ · ćšäșșæșć
±çäžæäžșèȘć·± ïœ MetaThinker · Life Praxis · Exploration ⊠Practice · Awareness · Becoming with AI","followers_count":175,"friends_count":423,"statuses_count":355,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1958821928036151296/Np0Fj1zL_normal.jpg","screen_name":"akce173","location":"","entities":{"description":{"urls":[]}}},"details":{"type":"The Thought Leader","description":"Namos is a reflective and insightful thinker who navigates the intersection of AI, education, and human experience with a thoughtful and philosophical lens. Their content elevates awareness around AI's societal impact and advocates for evolving mindsets and education systems to keep pace with technological change. They inspire followers to embrace exploration, deep thinking, and unique individuality in the AI era.","fun_fact":"Namos uniquely blends ancient wisdom and philosophy with cutting-edge AI topics, calling themselves a 'MetaThinker' who practices 'Life Praxis' â a rare combo of deep thought and practical application in the AI age.","purpose":"To awaken and guide individuals towards self-awareness and meaningful adaptation in a rapidly evolving AI-driven world, fostering a mindset that transcends outdated educational models and leverages AI as a tool to amplify oneâs unique human essence.","beliefs":"They believe that true competitive advantage stems not from merely mastering AI tools, but from cultivating deep personal experience, unique values, and original thought models that AI amplifies rather than replaces. They hold that education must evolve beyond industrial-era paradigms to nurture versatile âexplorersâ capable of mastering AIâs challenges.","strength":"Namos excels at delivering profound, balanced, and reflective insights that blend philosophy with practical AI implications, positioning them as a nuanced, trusted voice bridging technology and human experience. Their ability to challenge conventional wisdom and inspire new frameworks is a notable strength.","weakness":"Their content might lean heavily into abstract or philosophical discourse, which could potentially alienate followers seeking quick, straightforward AI tips or actionable hacks. This depth may limit mass appeal on a fast-moving platform like X.","recommendation":"To grow their audience on X, Namos should consider supplementing their deep thematic threads with bite-sized, relatable takeaways or practical AI insights that invite engagement. Collaborating with influencers in both tech and philosophy niches could also expand reach by connecting diverse communities who appreciate thoughtful AI dialogue.","roast":"Namosâ tweets are so intellectually deep, they probably have to remind their audience to breathe and blink â or risk becoming part of an AI experiment on attention span endurance. 
Maybe sprinkle some memes in there to keep the philosophizing from turning into a digital existential crisis! Biggest win: successfully shifting the conversation around AI from mere technological hype to essential philosophical and educational reform, establishing themselves as a pioneering voice in 'meta-thinking' about AI's role in human evolution.

Other accounts with The Thought Leader archetype

Thought Harbor (@thought_harbor) - digital content creator, 5,682 followers, 104,431 posts
A prolific creator whose reflections spark meaningful conversations; a relentless source of motivation and thoughtful advice on resilience, lifelong learning and self-awareness. They believe in mental strength over fleeting emotions, continuous learning, and keeping inner peace regardless of external chaos. Strength: remarkable consistency and the ability to distill complex life lessons into short, impactful messages. Weakness: the sheer volume of tweets risks overwhelming followers and diluting individual messages, and engagement skews towards replies rather than retweets. Growth tip: lean into shareable formats such as threads and bite-sized challenges, and pace the high-frequency posting to avoid follower fatigue.

oog (@oog84__) - 2,275 followers, 10,505 posts
A soulful guide who blends simple truths with deep reflections on life, growth and emotional resilience, delivered in short, almost primal sentences with playful self-reference. They believe in the healing power of nature, self-reliance and emotional honesty, and in growth that comes from action and skill development rather than fleeting inspiration. Strength: authentic storytelling that creates a safe emotional space and a loyal, engaged community. Weakness: an introspective, minimalist style that may limit appeal in fast-paced feeds. Growth tip: keep the emotional depth but experiment with interactive formats such as polls, explanatory threads, short video reflections and community-highlighting posts. Biggest win: building a tribe of over 2,000 engaged followers purely through heartfelt, raw reflections on life's beauty and pain.

Activity overview (15 October - 14 November 2025)

Engagement received on the tweets Ross posted each day; all other days in the window were zero across every metric:

Date          Replies  Retweets  Likes  Bookmarks    Views
2025-10-21          0         1     27          0    1,763
2025-10-22          1         1     21          6    5,200
2025-10-26          1         0     42         32   11,009
2025-10-28          0         0      7          2    1,531
2025-11-05          1         1     10          2    4,453
2025-11-07          0         0      2          0    1,117
2025-11-13          5         6    105         19   25,612

Daily values are sums over the tweets posted that day; for example, 13 November covers two tweets with 25,465 + 147 = 25,612 views and 97 + 8 = 105 likes. A minimal aggregation sketch follows below.
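In the page's underlying payload these figures are stored as one array per metric (nreplies, nbookmarks, nretweets, nlikes, nviews), each a list of daily buckets with a label (the date) and a value. The Python sketch below shows how such a payload could be collapsed into the per-day table above; the activities literal is a truncated stand-in for the real data and the daily_totals helper is illustrative, not part of the page.

from collections import defaultdict

# Truncated stand-in for the page's "activities" payload: one array per
# metric, each entry a daily bucket {"label": date, "value": total}.
activities = {
    "nlikes":    [{"label": "2025-10-21", "value": 27},
                  {"label": "2025-10-22", "value": 21},
                  {"label": "2025-10-23", "value": 0}],
    "nretweets": [{"label": "2025-10-21", "value": 1},
                  {"label": "2025-10-22", "value": 1},
                  {"label": "2025-10-23", "value": 0}],
    "nviews":    [{"label": "2025-10-21", "value": 1763},
                  {"label": "2025-10-22", "value": 5200},
                  {"label": "2025-10-23", "value": 0}],
}

def daily_totals(activities):
    """Collapse per-metric daily buckets into {date: {metric: value}},
    keeping only the days where at least one metric is non-zero."""
    table = defaultdict(dict)
    for metric, buckets in activities.items():
        for bucket in buckets:
            table[bucket["label"]][metric] = bucket["value"]
    return {day: row for day, row in sorted(table.items())
            if any(row.values())}

for day, row in daily_totals(activities).items():
    print(day, row)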
Recent tweets (15 October - 14 November 2025)

1. 21 Oct - reply to @natolambert: "Congrats!! A verifiable reward if I ever saw one. đ" - 1,763 views, 27 likes, 1 retweet.
2. 22 Oct - quoting @nousresearch: "Stoked to be speaking! What's needed beyond RLVR to enable rich, long-horizon reasoning? Come and find out đ" - 5,200 views, 21 likes, 6 bookmarks, 1 retweet, 1 reply.
3. 26 Oct - quoting @natolambert: "This is an important piece. Please work hard but remember to take care of yourselves đ." - 11,009 views, 42 likes, 32 bookmarks.
4. 28 Oct - reply to @sudoraohacker: "You have a point when it comes to product insights, but less so when it comes to research - most of the big players (especially GDM) have a large research presence in London. Because SF is so product focused, I actually find the average person to be behind on the level of insights about the next generation research compared to London (SF is very reactionary - eg see latest RL/environments wave). But it really depends on who you speak to." - 1,531 views, 7 likes, 2 bookmarks.
5. 5 Nov - quoting @matthewclifford: "Matt should obviously be PM, but not before Neil Warnock." - 4,453 views, 10 likes, 2 bookmarks, 1 retweet.
6. 7 Nov - reply to @soumithchintala: "Thanks for everything â€ïž So cool that you're trying something new. Stoked / very intrigued to see what's next đ." - 1,117 views, 2 likes.
7. 13 Nov - "Lots of negative talk about the UK on my timeline recently. Fun fact: much of the team that was behind successful Llama 2/3 releases - which kicked off the open LLM revolution - were based in London. There is much more AI talent here than just DeepMind. Everyone is looking for a Western alternative to Chinese open source, but maybe they should look where the modern West began? đŠ" (with photo) - 25,465 views, 97 likes, 6 retweets, 5 replies, 5 quotes, 19 bookmarks.
8. 13 Nov - reply to @felix_drost and @a16z: "'Britain is too small' could have been said many times in the past... and yet we became a great power. Why shouldn't it happen again?" - 147 views, 8 likes.

Per-tweet engagement rates can be computed directly from these counts; a minimal sketch follows.
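The sketch below ranks tweets by engagement rate, using the per-tweet fields that appear in the page's payload (view_count, favorite_count, retweet_count, reply_count, bookmark_count); the sample list and the engagement_rate helper are illustrative only.

# Rank tweets by engagement rate: (likes + retweets + replies + bookmarks) / views.
# The sample records mirror two of the tweets listed above (text truncated).
tweets = [
    {"full_text": "Lots of negative talk about the UK on my timeline recently. ...",
     "view_count": 25465, "favorite_count": 97, "retweet_count": 6,
     "reply_count": 5, "bookmark_count": 19},
    {"full_text": "Stoked to be speaking! ...",
     "view_count": 5200, "favorite_count": 21, "retweet_count": 1,
     "reply_count": 1, "bookmark_count": 6},
]

def engagement_rate(tweet):
    """Interactions per view; 0.0 when the view count is missing or zero."""
    interactions = (tweet["favorite_count"] + tweet["retweet_count"]
                    + tweet["reply_count"] + tweet["bookmark_count"])
    views = tweet.get("view_count") or 0
    return interactions / views if views else 0.0

for tweet in sorted(tweets, key=engagement_rate, reverse=True):
    print(f'{engagement_rate(tweet):6.2%}  {tweet["full_text"][:50]}')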
đ","in_reply_to_user_id_str":"2939913921","in_reply_to_status_id_str":"1980274175349571950","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-10-22","value":6,"startTime":1761004800000,"endTime":1761091200000,"tweets":[{"bookmarked":false,"display_text_range":[0,110],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1980374523846517195","quoted_status_permalink":{"url":"https://t.co/HS0MhdiXEr","expanded":"https://twitter.com/nousresearch/status/1980374523846517195","display":"x.com/nousresearch/sâŠ"},"retweeted":false,"fact_check":null,"id":"1980561520984748497","view_count":5200,"bookmark_count":6,"created_at":1761037638000,"favorite_count":21,"quote_count":0,"reply_count":1,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1980561520984748497","full_text":"Stoked to be speaking! \n\nWhatâs needed beyond RLVR to enable rich, long-horizon reasoning? Come and find out đ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-10-23","value":0,"startTime":1761091200000,"endTime":1761177600000,"tweets":[]},{"label":"2025-10-24","value":0,"startTime":1761177600000,"endTime":1761264000000,"tweets":[]},{"label":"2025-10-25","value":0,"startTime":1761264000000,"endTime":1761350400000,"tweets":[]},{"label":"2025-10-26","value":32,"startTime":1761350400000,"endTime":1761436800000,"tweets":[{"bookmarked":false,"display_text_range":[0,87],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1982093363928305903","quoted_status_permalink":{"url":"https://t.co/WonTShx3j9","expanded":"https://twitter.com/natolambert/status/1982093363928305903","display":"x.com/natolambert/stâŠ"},"retweeted":false,"fact_check":null,"id":"1982097453420548556","view_count":11009,"bookmark_count":32,"created_at":1761403833000,"favorite_count":42,"quote_count":0,"reply_count":1,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1982097453420548556","full_text":"This is an important piece. Please work hard but remember to take care of yourselves đ.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-10-27","value":0,"startTime":1761436800000,"endTime":1761523200000,"tweets":[]},{"label":"2025-10-28","value":2,"startTime":1761523200000,"endTime":1761609600000,"tweets":[{"bookmarked":false,"display_text_range":[15,289],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"598951979","name":"Arun Rao","screen_name":"sudoraohacker","indices":[0,14]}]},"favorited":false,"in_reply_to_screen_name":"sudoraohacker","lang":"en","retweeted":false,"fact_check":null,"id":"1982731600182841497","view_count":1531,"bookmark_count":2,"created_at":1761555025000,"favorite_count":7,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1982580105932370431","full_text":"You have a point when it comes to product insights, but less so when it comes to research - most of the big players (especially GDM) have a large research presence in London. \n\nBecause SF is so product focused, I actually find the average person to be behind on the level of insights about the next generation research compared to London (SF is very reactionary - eg see latest RL/environments wave). 
But it really depends on who you speak to.","in_reply_to_user_id_str":"598951979","in_reply_to_status_id_str":"1982580105932370431","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-10-29","value":0,"startTime":1761609600000,"endTime":1761696000000,"tweets":[]},{"label":"2025-10-30","value":0,"startTime":1761696000000,"endTime":1761782400000,"tweets":[]},{"label":"2025-10-31","value":0,"startTime":1761782400000,"endTime":1761868800000,"tweets":[]},{"label":"2025-11-01","value":0,"startTime":1761868800000,"endTime":1761955200000,"tweets":[]},{"label":"2025-11-02","value":0,"startTime":1761955200000,"endTime":1762041600000,"tweets":[]},{"label":"2025-11-03","value":0,"startTime":1762041600000,"endTime":1762128000000,"tweets":[]},{"label":"2025-11-04","value":0,"startTime":1762128000000,"endTime":1762214400000,"tweets":[]},{"label":"2025-11-05","value":2,"startTime":1762214400000,"endTime":1762300800000,"tweets":[{"bookmarked":false,"display_text_range":[0,82],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[{"display_url":"x.com/withoutwarny/sâŠ","expanded_url":"https://x.com/withoutwarny/status/1476918697839280133?s=46","url":"https://t.co/MgbEBWDOxC","indices":[59,82]}],"user_mentions":[]},"favorited":false,"lang":"en","possibly_sensitive":false,"possibly_sensitive_editable":true,"quoted_status_id_str":"1983885698773164383","quoted_status_permalink":{"url":"https://t.co/PLAuirhrIz","expanded":"https://twitter.com/matthewclifford/status/1983885698773164383","display":"x.com/matthewclifforâŠ"},"retweeted":false,"fact_check":null,"id":"1985659523806450040","view_count":4453,"bookmark_count":2,"created_at":1762253096000,"favorite_count":10,"quote_count":0,"reply_count":1,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1985659523806450040","full_text":"Matt should obviously be PM, but not before Neil Warnock.\n\nhttps://t.co/MgbEBWDOxC","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-11-06","value":0,"startTime":1762300800000,"endTime":1762387200000,"tweets":[]},{"label":"2025-11-07","value":0,"startTime":1762387200000,"endTime":1762473600000,"tweets":[{"bookmarked":false,"display_text_range":[17,131],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"70831441","name":"Soumith Chintala","screen_name":"soumithchintala","indices":[0,16]}]},"favorited":false,"in_reply_to_screen_name":"soumithchintala","lang":"en","retweeted":false,"fact_check":null,"id":"1986511758295470435","view_count":1117,"bookmark_count":0,"created_at":1762456285000,"favorite_count":2,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1986503070734557568","full_text":"@soumithchintala Thanks for everything â€ïž \n\nSo cool that youâre trying something new. 
Stoked / very intrigued to see whatâs next đ.","in_reply_to_user_id_str":"70831441","in_reply_to_status_id_str":"1986503070734557568","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-11-08","value":0,"startTime":1762473600000,"endTime":1762560000000,"tweets":[]},{"label":"2025-11-09","value":0,"startTime":1762560000000,"endTime":1762646400000,"tweets":[]},{"label":"2025-11-10","value":0,"startTime":1762646400000,"endTime":1762732800000,"tweets":[]},{"label":"2025-11-11","value":0,"startTime":1762732800000,"endTime":1762819200000,"tweets":[]},{"label":"2025-11-12","value":0,"startTime":1762819200000,"endTime":1762905600000,"tweets":[]},{"label":"2025-11-13","value":19,"startTime":1762905600000,"endTime":1762992000000,"tweets":[{"bookmarked":false,"display_text_range":[0,280],"entities":{"hashtags":[],"media":[{"display_url":"pic.x.com/3bDz8v0kxx","expanded_url":"https://x.com/rosstaylor90/status/1988523696017863109/photo/1","id_str":"1988523691747971072","indices":[281,304],"media_key":"3_1988523691747971072","media_url_https":"https://pbs.twimg.com/media/G5inwxIWEAAd6q3.jpg","type":"photo","url":"https://t.co/3bDz8v0kxx","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[]},"medium":{"faces":[]},"small":{"faces":[]},"orig":{"faces":[]}},"sizes":{"large":{"h":600,"w":400,"resize":"fit"},"medium":{"h":600,"w":400,"resize":"fit"},"small":{"h":600,"w":400,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":600,"width":400,"focus_rects":[{"x":0,"y":376,"w":400,"h":224},{"x":0,"y":200,"w":400,"h":400},{"x":0,"y":144,"w":400,"h":456},{"x":100,"y":0,"w":300,"h":600},{"x":0,"y":0,"w":400,"h":600}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1988523691747971072"}}}],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"extended_entities":{"media":[{"display_url":"pic.x.com/3bDz8v0kxx","expanded_url":"https://x.com/rosstaylor90/status/1988523696017863109/photo/1","id_str":"1988523691747971072","indices":[281,304],"media_key":"3_1988523691747971072","media_url_https":"https://pbs.twimg.com/media/G5inwxIWEAAd6q3.jpg","type":"photo","url":"https://t.co/3bDz8v0kxx","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[]},"medium":{"faces":[]},"small":{"faces":[]},"orig":{"faces":[]}},"sizes":{"large":{"h":600,"w":400,"resize":"fit"},"medium":{"h":600,"w":400,"resize":"fit"},"small":{"h":600,"w":400,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":600,"width":400,"focus_rects":[{"x":0,"y":376,"w":400,"h":224},{"x":0,"y":200,"w":400,"h":400},{"x":0,"y":144,"w":400,"h":456},{"x":100,"y":0,"w":300,"h":600},{"x":0,"y":0,"w":400,"h":600}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1988523691747971072"}}}]},"favorited":false,"lang":"en","possibly_sensitive":false,"possibly_sensitive_editable":true,"retweeted":false,"fact_check":null,"id":"1988523696017863109","view_count":25465,"bookmark_count":19,"created_at":1762935968000,"favorite_count":97,"quote_count":5,"reply_count":5,"retweet_count":6,"user_id_str":"524807755","conversation_id_str":"1988523696017863109","full_text":"Lots of negative talk about the UK on my timeline recently. \n\nFun fact: much of the team that was behind successful Llama 2/3 releases - which kicked off the open LLM revolution - were based in London. 
There is much more AI talent here than just DeepMind.\n\nEveryone is looking for a Western alternative to Chinese open source, but maybe they should look where the modern West began? đŠ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[19,150],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"851226008770547713","name":"Felix Drost","screen_name":"felix_drost","indices":[0,12]},{"id_str":"64844802","name":"a16z","screen_name":"a16z","indices":[13,18]}]},"favorited":false,"in_reply_to_screen_name":"felix_drost","lang":"en","retweeted":false,"fact_check":null,"id":"1988538138113810892","view_count":147,"bookmark_count":0,"created_at":1762939412000,"favorite_count":8,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1988523696017863109","full_text":"@felix_drost @a16z âBritain is too smallâ could have been said many times in the past⊠and yet we became a great power. Why shouldnât it happen again?","in_reply_to_user_id_str":"851226008770547713","in_reply_to_status_id_str":"1988536333770412435","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-11-14","value":0,"startTime":1762992000000,"endTime":1763078400000,"tweets":[]}],"nretweets":[{"label":"2025-10-15","value":0,"startTime":1760400000000,"endTime":1760486400000,"tweets":[]},{"label":"2025-10-16","value":0,"startTime":1760486400000,"endTime":1760572800000,"tweets":[]},{"label":"2025-10-17","value":0,"startTime":1760572800000,"endTime":1760659200000,"tweets":[]},{"label":"2025-10-18","value":0,"startTime":1760659200000,"endTime":1760745600000,"tweets":[]},{"label":"2025-10-19","value":0,"startTime":1760745600000,"endTime":1760832000000,"tweets":[]},{"label":"2025-10-20","value":0,"startTime":1760832000000,"endTime":1760918400000,"tweets":[]},{"label":"2025-10-21","value":1,"startTime":1760918400000,"endTime":1761004800000,"tweets":[{"bookmarked":false,"display_text_range":[13,64],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"2939913921","name":"Nathan Lambert","screen_name":"natolambert","indices":[0,12]}]},"favorited":false,"in_reply_to_screen_name":"natolambert","lang":"en","retweeted":false,"fact_check":null,"id":"1980289506885726253","view_count":1763,"bookmark_count":0,"created_at":1760972785000,"favorite_count":27,"quote_count":0,"reply_count":0,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1980274175349571950","full_text":"@natolambert Congrats!! A verifiable reward if I ever saw one. 
đ","in_reply_to_user_id_str":"2939913921","in_reply_to_status_id_str":"1980274175349571950","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-10-22","value":1,"startTime":1761004800000,"endTime":1761091200000,"tweets":[{"bookmarked":false,"display_text_range":[0,110],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1980374523846517195","quoted_status_permalink":{"url":"https://t.co/HS0MhdiXEr","expanded":"https://twitter.com/nousresearch/status/1980374523846517195","display":"x.com/nousresearch/sâŠ"},"retweeted":false,"fact_check":null,"id":"1980561520984748497","view_count":5200,"bookmark_count":6,"created_at":1761037638000,"favorite_count":21,"quote_count":0,"reply_count":1,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1980561520984748497","full_text":"Stoked to be speaking! \n\nWhatâs needed beyond RLVR to enable rich, long-horizon reasoning? Come and find out đ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-10-23","value":0,"startTime":1761091200000,"endTime":1761177600000,"tweets":[]},{"label":"2025-10-24","value":0,"startTime":1761177600000,"endTime":1761264000000,"tweets":[]},{"label":"2025-10-25","value":0,"startTime":1761264000000,"endTime":1761350400000,"tweets":[]},{"label":"2025-10-26","value":0,"startTime":1761350400000,"endTime":1761436800000,"tweets":[{"bookmarked":false,"display_text_range":[0,87],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1982093363928305903","quoted_status_permalink":{"url":"https://t.co/WonTShx3j9","expanded":"https://twitter.com/natolambert/status/1982093363928305903","display":"x.com/natolambert/stâŠ"},"retweeted":false,"fact_check":null,"id":"1982097453420548556","view_count":11009,"bookmark_count":32,"created_at":1761403833000,"favorite_count":42,"quote_count":0,"reply_count":1,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1982097453420548556","full_text":"This is an important piece. Please work hard but remember to take care of yourselves đ.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-10-27","value":0,"startTime":1761436800000,"endTime":1761523200000,"tweets":[]},{"label":"2025-10-28","value":0,"startTime":1761523200000,"endTime":1761609600000,"tweets":[{"bookmarked":false,"display_text_range":[15,289],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"598951979","name":"Arun Rao","screen_name":"sudoraohacker","indices":[0,14]}]},"favorited":false,"in_reply_to_screen_name":"sudoraohacker","lang":"en","retweeted":false,"fact_check":null,"id":"1982731600182841497","view_count":1531,"bookmark_count":2,"created_at":1761555025000,"favorite_count":7,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1982580105932370431","full_text":"You have a point when it comes to product insights, but less so when it comes to research - most of the big players (especially GDM) have a large research presence in London. \n\nBecause SF is so product focused, I actually find the average person to be behind on the level of insights about the next generation research compared to London (SF is very reactionary - eg see latest RL/environments wave). 
But it really depends on who you speak to.","in_reply_to_user_id_str":"598951979","in_reply_to_status_id_str":"1982580105932370431","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-10-29","value":0,"startTime":1761609600000,"endTime":1761696000000,"tweets":[]},{"label":"2025-10-30","value":0,"startTime":1761696000000,"endTime":1761782400000,"tweets":[]},{"label":"2025-10-31","value":0,"startTime":1761782400000,"endTime":1761868800000,"tweets":[]},{"label":"2025-11-01","value":0,"startTime":1761868800000,"endTime":1761955200000,"tweets":[]},{"label":"2025-11-02","value":0,"startTime":1761955200000,"endTime":1762041600000,"tweets":[]},{"label":"2025-11-03","value":0,"startTime":1762041600000,"endTime":1762128000000,"tweets":[]},{"label":"2025-11-04","value":0,"startTime":1762128000000,"endTime":1762214400000,"tweets":[]},{"label":"2025-11-05","value":1,"startTime":1762214400000,"endTime":1762300800000,"tweets":[{"bookmarked":false,"display_text_range":[0,82],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[{"display_url":"x.com/withoutwarny/sâŠ","expanded_url":"https://x.com/withoutwarny/status/1476918697839280133?s=46","url":"https://t.co/MgbEBWDOxC","indices":[59,82]}],"user_mentions":[]},"favorited":false,"lang":"en","possibly_sensitive":false,"possibly_sensitive_editable":true,"quoted_status_id_str":"1983885698773164383","quoted_status_permalink":{"url":"https://t.co/PLAuirhrIz","expanded":"https://twitter.com/matthewclifford/status/1983885698773164383","display":"x.com/matthewclifforâŠ"},"retweeted":false,"fact_check":null,"id":"1985659523806450040","view_count":4453,"bookmark_count":2,"created_at":1762253096000,"favorite_count":10,"quote_count":0,"reply_count":1,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1985659523806450040","full_text":"Matt should obviously be PM, but not before Neil Warnock.\n\nhttps://t.co/MgbEBWDOxC","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-11-06","value":0,"startTime":1762300800000,"endTime":1762387200000,"tweets":[]},{"label":"2025-11-07","value":0,"startTime":1762387200000,"endTime":1762473600000,"tweets":[{"bookmarked":false,"display_text_range":[17,131],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"70831441","name":"Soumith Chintala","screen_name":"soumithchintala","indices":[0,16]}]},"favorited":false,"in_reply_to_screen_name":"soumithchintala","lang":"en","retweeted":false,"fact_check":null,"id":"1986511758295470435","view_count":1117,"bookmark_count":0,"created_at":1762456285000,"favorite_count":2,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1986503070734557568","full_text":"@soumithchintala Thanks for everything â€ïž \n\nSo cool that youâre trying something new. 
Stoked / very intrigued to see whatâs next đ.","in_reply_to_user_id_str":"70831441","in_reply_to_status_id_str":"1986503070734557568","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-11-08","value":0,"startTime":1762473600000,"endTime":1762560000000,"tweets":[]},{"label":"2025-11-09","value":0,"startTime":1762560000000,"endTime":1762646400000,"tweets":[]},{"label":"2025-11-10","value":0,"startTime":1762646400000,"endTime":1762732800000,"tweets":[]},{"label":"2025-11-11","value":0,"startTime":1762732800000,"endTime":1762819200000,"tweets":[]},{"label":"2025-11-12","value":0,"startTime":1762819200000,"endTime":1762905600000,"tweets":[]},{"label":"2025-11-13","value":6,"startTime":1762905600000,"endTime":1762992000000,"tweets":[{"bookmarked":false,"display_text_range":[0,280],"entities":{"hashtags":[],"media":[{"display_url":"pic.x.com/3bDz8v0kxx","expanded_url":"https://x.com/rosstaylor90/status/1988523696017863109/photo/1","id_str":"1988523691747971072","indices":[281,304],"media_key":"3_1988523691747971072","media_url_https":"https://pbs.twimg.com/media/G5inwxIWEAAd6q3.jpg","type":"photo","url":"https://t.co/3bDz8v0kxx","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[]},"medium":{"faces":[]},"small":{"faces":[]},"orig":{"faces":[]}},"sizes":{"large":{"h":600,"w":400,"resize":"fit"},"medium":{"h":600,"w":400,"resize":"fit"},"small":{"h":600,"w":400,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":600,"width":400,"focus_rects":[{"x":0,"y":376,"w":400,"h":224},{"x":0,"y":200,"w":400,"h":400},{"x":0,"y":144,"w":400,"h":456},{"x":100,"y":0,"w":300,"h":600},{"x":0,"y":0,"w":400,"h":600}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1988523691747971072"}}}],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"extended_entities":{"media":[{"display_url":"pic.x.com/3bDz8v0kxx","expanded_url":"https://x.com/rosstaylor90/status/1988523696017863109/photo/1","id_str":"1988523691747971072","indices":[281,304],"media_key":"3_1988523691747971072","media_url_https":"https://pbs.twimg.com/media/G5inwxIWEAAd6q3.jpg","type":"photo","url":"https://t.co/3bDz8v0kxx","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[]},"medium":{"faces":[]},"small":{"faces":[]},"orig":{"faces":[]}},"sizes":{"large":{"h":600,"w":400,"resize":"fit"},"medium":{"h":600,"w":400,"resize":"fit"},"small":{"h":600,"w":400,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":600,"width":400,"focus_rects":[{"x":0,"y":376,"w":400,"h":224},{"x":0,"y":200,"w":400,"h":400},{"x":0,"y":144,"w":400,"h":456},{"x":100,"y":0,"w":300,"h":600},{"x":0,"y":0,"w":400,"h":600}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1988523691747971072"}}}]},"favorited":false,"lang":"en","possibly_sensitive":false,"possibly_sensitive_editable":true,"retweeted":false,"fact_check":null,"id":"1988523696017863109","view_count":25465,"bookmark_count":19,"created_at":1762935968000,"favorite_count":97,"quote_count":5,"reply_count":5,"retweet_count":6,"user_id_str":"524807755","conversation_id_str":"1988523696017863109","full_text":"Lots of negative talk about the UK on my timeline recently. \n\nFun fact: much of the team that was behind successful Llama 2/3 releases - which kicked off the open LLM revolution - were based in London. 
There is much more AI talent here than just DeepMind.\n\nEveryone is looking for a Western alternative to Chinese open source, but maybe they should look where the modern West began? đŠ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[19,150],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"851226008770547713","name":"Felix Drost","screen_name":"felix_drost","indices":[0,12]},{"id_str":"64844802","name":"a16z","screen_name":"a16z","indices":[13,18]}]},"favorited":false,"in_reply_to_screen_name":"felix_drost","lang":"en","retweeted":false,"fact_check":null,"id":"1988538138113810892","view_count":147,"bookmark_count":0,"created_at":1762939412000,"favorite_count":8,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1988523696017863109","full_text":"@felix_drost @a16z âBritain is too smallâ could have been said many times in the past⊠and yet we became a great power. Why shouldnât it happen again?","in_reply_to_user_id_str":"851226008770547713","in_reply_to_status_id_str":"1988536333770412435","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-11-14","value":0,"startTime":1762992000000,"endTime":1763078400000,"tweets":[]}],"nlikes":[{"label":"2025-10-15","value":0,"startTime":1760400000000,"endTime":1760486400000,"tweets":[]},{"label":"2025-10-16","value":0,"startTime":1760486400000,"endTime":1760572800000,"tweets":[]},{"label":"2025-10-17","value":0,"startTime":1760572800000,"endTime":1760659200000,"tweets":[]},{"label":"2025-10-18","value":0,"startTime":1760659200000,"endTime":1760745600000,"tweets":[]},{"label":"2025-10-19","value":0,"startTime":1760745600000,"endTime":1760832000000,"tweets":[]},{"label":"2025-10-20","value":0,"startTime":1760832000000,"endTime":1760918400000,"tweets":[]},{"label":"2025-10-21","value":27,"startTime":1760918400000,"endTime":1761004800000,"tweets":[{"bookmarked":false,"display_text_range":[13,64],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"2939913921","name":"Nathan Lambert","screen_name":"natolambert","indices":[0,12]}]},"favorited":false,"in_reply_to_screen_name":"natolambert","lang":"en","retweeted":false,"fact_check":null,"id":"1980289506885726253","view_count":1763,"bookmark_count":0,"created_at":1760972785000,"favorite_count":27,"quote_count":0,"reply_count":0,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1980274175349571950","full_text":"@natolambert Congrats!! A verifiable reward if I ever saw one. 
đ","in_reply_to_user_id_str":"2939913921","in_reply_to_status_id_str":"1980274175349571950","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-10-22","value":21,"startTime":1761004800000,"endTime":1761091200000,"tweets":[{"bookmarked":false,"display_text_range":[0,110],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1980374523846517195","quoted_status_permalink":{"url":"https://t.co/HS0MhdiXEr","expanded":"https://twitter.com/nousresearch/status/1980374523846517195","display":"x.com/nousresearch/sâŠ"},"retweeted":false,"fact_check":null,"id":"1980561520984748497","view_count":5200,"bookmark_count":6,"created_at":1761037638000,"favorite_count":21,"quote_count":0,"reply_count":1,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1980561520984748497","full_text":"Stoked to be speaking! \n\nWhatâs needed beyond RLVR to enable rich, long-horizon reasoning? Come and find out đ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-10-23","value":0,"startTime":1761091200000,"endTime":1761177600000,"tweets":[]},{"label":"2025-10-24","value":0,"startTime":1761177600000,"endTime":1761264000000,"tweets":[]},{"label":"2025-10-25","value":0,"startTime":1761264000000,"endTime":1761350400000,"tweets":[]},{"label":"2025-10-26","value":42,"startTime":1761350400000,"endTime":1761436800000,"tweets":[{"bookmarked":false,"display_text_range":[0,87],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1982093363928305903","quoted_status_permalink":{"url":"https://t.co/WonTShx3j9","expanded":"https://twitter.com/natolambert/status/1982093363928305903","display":"x.com/natolambert/stâŠ"},"retweeted":false,"fact_check":null,"id":"1982097453420548556","view_count":11009,"bookmark_count":32,"created_at":1761403833000,"favorite_count":42,"quote_count":0,"reply_count":1,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1982097453420548556","full_text":"This is an important piece. Please work hard but remember to take care of yourselves đ.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-10-27","value":0,"startTime":1761436800000,"endTime":1761523200000,"tweets":[]},{"label":"2025-10-28","value":7,"startTime":1761523200000,"endTime":1761609600000,"tweets":[{"bookmarked":false,"display_text_range":[15,289],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"598951979","name":"Arun Rao","screen_name":"sudoraohacker","indices":[0,14]}]},"favorited":false,"in_reply_to_screen_name":"sudoraohacker","lang":"en","retweeted":false,"fact_check":null,"id":"1982731600182841497","view_count":1531,"bookmark_count":2,"created_at":1761555025000,"favorite_count":7,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1982580105932370431","full_text":"You have a point when it comes to product insights, but less so when it comes to research - most of the big players (especially GDM) have a large research presence in London. \n\nBecause SF is so product focused, I actually find the average person to be behind on the level of insights about the next generation research compared to London (SF is very reactionary - eg see latest RL/environments wave). 
But it really depends on who you speak to.","in_reply_to_user_id_str":"598951979","in_reply_to_status_id_str":"1982580105932370431","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-10-29","value":0,"startTime":1761609600000,"endTime":1761696000000,"tweets":[]},{"label":"2025-10-30","value":0,"startTime":1761696000000,"endTime":1761782400000,"tweets":[]},{"label":"2025-10-31","value":0,"startTime":1761782400000,"endTime":1761868800000,"tweets":[]},{"label":"2025-11-01","value":0,"startTime":1761868800000,"endTime":1761955200000,"tweets":[]},{"label":"2025-11-02","value":0,"startTime":1761955200000,"endTime":1762041600000,"tweets":[]},{"label":"2025-11-03","value":0,"startTime":1762041600000,"endTime":1762128000000,"tweets":[]},{"label":"2025-11-04","value":0,"startTime":1762128000000,"endTime":1762214400000,"tweets":[]},{"label":"2025-11-05","value":10,"startTime":1762214400000,"endTime":1762300800000,"tweets":[{"bookmarked":false,"display_text_range":[0,82],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[{"display_url":"x.com/withoutwarny/sâŠ","expanded_url":"https://x.com/withoutwarny/status/1476918697839280133?s=46","url":"https://t.co/MgbEBWDOxC","indices":[59,82]}],"user_mentions":[]},"favorited":false,"lang":"en","possibly_sensitive":false,"possibly_sensitive_editable":true,"quoted_status_id_str":"1983885698773164383","quoted_status_permalink":{"url":"https://t.co/PLAuirhrIz","expanded":"https://twitter.com/matthewclifford/status/1983885698773164383","display":"x.com/matthewclifforâŠ"},"retweeted":false,"fact_check":null,"id":"1985659523806450040","view_count":4453,"bookmark_count":2,"created_at":1762253096000,"favorite_count":10,"quote_count":0,"reply_count":1,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1985659523806450040","full_text":"Matt should obviously be PM, but not before Neil Warnock.\n\nhttps://t.co/MgbEBWDOxC","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-11-06","value":0,"startTime":1762300800000,"endTime":1762387200000,"tweets":[]},{"label":"2025-11-07","value":2,"startTime":1762387200000,"endTime":1762473600000,"tweets":[{"bookmarked":false,"display_text_range":[17,131],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"70831441","name":"Soumith Chintala","screen_name":"soumithchintala","indices":[0,16]}]},"favorited":false,"in_reply_to_screen_name":"soumithchintala","lang":"en","retweeted":false,"fact_check":null,"id":"1986511758295470435","view_count":1117,"bookmark_count":0,"created_at":1762456285000,"favorite_count":2,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1986503070734557568","full_text":"@soumithchintala Thanks for everything â€ïž \n\nSo cool that youâre trying something new. 
Stoked / very intrigued to see whatâs next đ.","in_reply_to_user_id_str":"70831441","in_reply_to_status_id_str":"1986503070734557568","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-11-08","value":0,"startTime":1762473600000,"endTime":1762560000000,"tweets":[]},{"label":"2025-11-09","value":0,"startTime":1762560000000,"endTime":1762646400000,"tweets":[]},{"label":"2025-11-10","value":0,"startTime":1762646400000,"endTime":1762732800000,"tweets":[]},{"label":"2025-11-11","value":0,"startTime":1762732800000,"endTime":1762819200000,"tweets":[]},{"label":"2025-11-12","value":0,"startTime":1762819200000,"endTime":1762905600000,"tweets":[]},{"label":"2025-11-13","value":105,"startTime":1762905600000,"endTime":1762992000000,"tweets":[{"bookmarked":false,"display_text_range":[0,280],"entities":{"hashtags":[],"media":[{"display_url":"pic.x.com/3bDz8v0kxx","expanded_url":"https://x.com/rosstaylor90/status/1988523696017863109/photo/1","id_str":"1988523691747971072","indices":[281,304],"media_key":"3_1988523691747971072","media_url_https":"https://pbs.twimg.com/media/G5inwxIWEAAd6q3.jpg","type":"photo","url":"https://t.co/3bDz8v0kxx","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[]},"medium":{"faces":[]},"small":{"faces":[]},"orig":{"faces":[]}},"sizes":{"large":{"h":600,"w":400,"resize":"fit"},"medium":{"h":600,"w":400,"resize":"fit"},"small":{"h":600,"w":400,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":600,"width":400,"focus_rects":[{"x":0,"y":376,"w":400,"h":224},{"x":0,"y":200,"w":400,"h":400},{"x":0,"y":144,"w":400,"h":456},{"x":100,"y":0,"w":300,"h":600},{"x":0,"y":0,"w":400,"h":600}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1988523691747971072"}}}],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"extended_entities":{"media":[{"display_url":"pic.x.com/3bDz8v0kxx","expanded_url":"https://x.com/rosstaylor90/status/1988523696017863109/photo/1","id_str":"1988523691747971072","indices":[281,304],"media_key":"3_1988523691747971072","media_url_https":"https://pbs.twimg.com/media/G5inwxIWEAAd6q3.jpg","type":"photo","url":"https://t.co/3bDz8v0kxx","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[]},"medium":{"faces":[]},"small":{"faces":[]},"orig":{"faces":[]}},"sizes":{"large":{"h":600,"w":400,"resize":"fit"},"medium":{"h":600,"w":400,"resize":"fit"},"small":{"h":600,"w":400,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":600,"width":400,"focus_rects":[{"x":0,"y":376,"w":400,"h":224},{"x":0,"y":200,"w":400,"h":400},{"x":0,"y":144,"w":400,"h":456},{"x":100,"y":0,"w":300,"h":600},{"x":0,"y":0,"w":400,"h":600}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1988523691747971072"}}}]},"favorited":false,"lang":"en","possibly_sensitive":false,"possibly_sensitive_editable":true,"retweeted":false,"fact_check":null,"id":"1988523696017863109","view_count":25465,"bookmark_count":19,"created_at":1762935968000,"favorite_count":97,"quote_count":5,"reply_count":5,"retweet_count":6,"user_id_str":"524807755","conversation_id_str":"1988523696017863109","full_text":"Lots of negative talk about the UK on my timeline recently. \n\nFun fact: much of the team that was behind successful Llama 2/3 releases - which kicked off the open LLM revolution - were based in London. 
There is much more AI talent here than just DeepMind.\n\nEveryone is looking for a Western alternative to Chinese open source, but maybe they should look where the modern West began? đŠ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[19,150],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"851226008770547713","name":"Felix Drost","screen_name":"felix_drost","indices":[0,12]},{"id_str":"64844802","name":"a16z","screen_name":"a16z","indices":[13,18]}]},"favorited":false,"in_reply_to_screen_name":"felix_drost","lang":"en","retweeted":false,"fact_check":null,"id":"1988538138113810892","view_count":147,"bookmark_count":0,"created_at":1762939412000,"favorite_count":8,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1988523696017863109","full_text":"@felix_drost @a16z âBritain is too smallâ could have been said many times in the past⊠and yet we became a great power. Why shouldnât it happen again?","in_reply_to_user_id_str":"851226008770547713","in_reply_to_status_id_str":"1988536333770412435","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-11-14","value":0,"startTime":1762992000000,"endTime":1763078400000,"tweets":[]}],"nviews":[{"label":"2025-10-15","value":0,"startTime":1760400000000,"endTime":1760486400000,"tweets":[]},{"label":"2025-10-16","value":0,"startTime":1760486400000,"endTime":1760572800000,"tweets":[]},{"label":"2025-10-17","value":0,"startTime":1760572800000,"endTime":1760659200000,"tweets":[]},{"label":"2025-10-18","value":0,"startTime":1760659200000,"endTime":1760745600000,"tweets":[]},{"label":"2025-10-19","value":0,"startTime":1760745600000,"endTime":1760832000000,"tweets":[]},{"label":"2025-10-20","value":0,"startTime":1760832000000,"endTime":1760918400000,"tweets":[]},{"label":"2025-10-21","value":1763,"startTime":1760918400000,"endTime":1761004800000,"tweets":[{"bookmarked":false,"display_text_range":[13,64],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"2939913921","name":"Nathan Lambert","screen_name":"natolambert","indices":[0,12]}]},"favorited":false,"in_reply_to_screen_name":"natolambert","lang":"en","retweeted":false,"fact_check":null,"id":"1980289506885726253","view_count":1763,"bookmark_count":0,"created_at":1760972785000,"favorite_count":27,"quote_count":0,"reply_count":0,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1980274175349571950","full_text":"@natolambert Congrats!! A verifiable reward if I ever saw one. 
đ","in_reply_to_user_id_str":"2939913921","in_reply_to_status_id_str":"1980274175349571950","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-10-22","value":5200,"startTime":1761004800000,"endTime":1761091200000,"tweets":[{"bookmarked":false,"display_text_range":[0,110],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1980374523846517195","quoted_status_permalink":{"url":"https://t.co/HS0MhdiXEr","expanded":"https://twitter.com/nousresearch/status/1980374523846517195","display":"x.com/nousresearch/sâŠ"},"retweeted":false,"fact_check":null,"id":"1980561520984748497","view_count":5200,"bookmark_count":6,"created_at":1761037638000,"favorite_count":21,"quote_count":0,"reply_count":1,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1980561520984748497","full_text":"Stoked to be speaking! \n\nWhatâs needed beyond RLVR to enable rich, long-horizon reasoning? Come and find out đ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-10-23","value":0,"startTime":1761091200000,"endTime":1761177600000,"tweets":[]},{"label":"2025-10-24","value":0,"startTime":1761177600000,"endTime":1761264000000,"tweets":[]},{"label":"2025-10-25","value":0,"startTime":1761264000000,"endTime":1761350400000,"tweets":[]},{"label":"2025-10-26","value":11009,"startTime":1761350400000,"endTime":1761436800000,"tweets":[{"bookmarked":false,"display_text_range":[0,87],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"favorited":false,"lang":"en","quoted_status_id_str":"1982093363928305903","quoted_status_permalink":{"url":"https://t.co/WonTShx3j9","expanded":"https://twitter.com/natolambert/status/1982093363928305903","display":"x.com/natolambert/stâŠ"},"retweeted":false,"fact_check":null,"id":"1982097453420548556","view_count":11009,"bookmark_count":32,"created_at":1761403833000,"favorite_count":42,"quote_count":0,"reply_count":1,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1982097453420548556","full_text":"This is an important piece. Please work hard but remember to take care of yourselves đ.","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-10-27","value":0,"startTime":1761436800000,"endTime":1761523200000,"tweets":[]},{"label":"2025-10-28","value":1531,"startTime":1761523200000,"endTime":1761609600000,"tweets":[{"bookmarked":false,"display_text_range":[15,289],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"598951979","name":"Arun Rao","screen_name":"sudoraohacker","indices":[0,14]}]},"favorited":false,"in_reply_to_screen_name":"sudoraohacker","lang":"en","retweeted":false,"fact_check":null,"id":"1982731600182841497","view_count":1531,"bookmark_count":2,"created_at":1761555025000,"favorite_count":7,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1982580105932370431","full_text":"You have a point when it comes to product insights, but less so when it comes to research - most of the big players (especially GDM) have a large research presence in London. \n\nBecause SF is so product focused, I actually find the average person to be behind on the level of insights about the next generation research compared to London (SF is very reactionary - eg see latest RL/environments wave). 
But it really depends on who you speak to.","in_reply_to_user_id_str":"598951979","in_reply_to_status_id_str":"1982580105932370431","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-10-29","value":0,"startTime":1761609600000,"endTime":1761696000000,"tweets":[]},{"label":"2025-10-30","value":0,"startTime":1761696000000,"endTime":1761782400000,"tweets":[]},{"label":"2025-10-31","value":0,"startTime":1761782400000,"endTime":1761868800000,"tweets":[]},{"label":"2025-11-01","value":0,"startTime":1761868800000,"endTime":1761955200000,"tweets":[]},{"label":"2025-11-02","value":0,"startTime":1761955200000,"endTime":1762041600000,"tweets":[]},{"label":"2025-11-03","value":0,"startTime":1762041600000,"endTime":1762128000000,"tweets":[]},{"label":"2025-11-04","value":0,"startTime":1762128000000,"endTime":1762214400000,"tweets":[]},{"label":"2025-11-05","value":4453,"startTime":1762214400000,"endTime":1762300800000,"tweets":[{"bookmarked":false,"display_text_range":[0,82],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[{"display_url":"x.com/withoutwarny/sâŠ","expanded_url":"https://x.com/withoutwarny/status/1476918697839280133?s=46","url":"https://t.co/MgbEBWDOxC","indices":[59,82]}],"user_mentions":[]},"favorited":false,"lang":"en","possibly_sensitive":false,"possibly_sensitive_editable":true,"quoted_status_id_str":"1983885698773164383","quoted_status_permalink":{"url":"https://t.co/PLAuirhrIz","expanded":"https://twitter.com/matthewclifford/status/1983885698773164383","display":"x.com/matthewclifforâŠ"},"retweeted":false,"fact_check":null,"id":"1985659523806450040","view_count":4453,"bookmark_count":2,"created_at":1762253096000,"favorite_count":10,"quote_count":0,"reply_count":1,"retweet_count":1,"user_id_str":"524807755","conversation_id_str":"1985659523806450040","full_text":"Matt should obviously be PM, but not before Neil Warnock.\n\nhttps://t.co/MgbEBWDOxC","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":1,"is_ai":null,"ai_score":null}]},{"label":"2025-11-06","value":0,"startTime":1762300800000,"endTime":1762387200000,"tweets":[]},{"label":"2025-11-07","value":1117,"startTime":1762387200000,"endTime":1762473600000,"tweets":[{"bookmarked":false,"display_text_range":[17,131],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"70831441","name":"Soumith Chintala","screen_name":"soumithchintala","indices":[0,16]}]},"favorited":false,"in_reply_to_screen_name":"soumithchintala","lang":"en","retweeted":false,"fact_check":null,"id":"1986511758295470435","view_count":1117,"bookmark_count":0,"created_at":1762456285000,"favorite_count":2,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1986503070734557568","full_text":"@soumithchintala Thanks for everything â€ïž \n\nSo cool that youâre trying something new. 
Stoked / very intrigued to see whatâs next đ.","in_reply_to_user_id_str":"70831441","in_reply_to_status_id_str":"1986503070734557568","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-11-08","value":0,"startTime":1762473600000,"endTime":1762560000000,"tweets":[]},{"label":"2025-11-09","value":0,"startTime":1762560000000,"endTime":1762646400000,"tweets":[]},{"label":"2025-11-10","value":0,"startTime":1762646400000,"endTime":1762732800000,"tweets":[]},{"label":"2025-11-11","value":0,"startTime":1762732800000,"endTime":1762819200000,"tweets":[]},{"label":"2025-11-12","value":0,"startTime":1762819200000,"endTime":1762905600000,"tweets":[]},{"label":"2025-11-13","value":25612,"startTime":1762905600000,"endTime":1762992000000,"tweets":[{"bookmarked":false,"display_text_range":[0,280],"entities":{"hashtags":[],"media":[{"display_url":"pic.x.com/3bDz8v0kxx","expanded_url":"https://x.com/rosstaylor90/status/1988523696017863109/photo/1","id_str":"1988523691747971072","indices":[281,304],"media_key":"3_1988523691747971072","media_url_https":"https://pbs.twimg.com/media/G5inwxIWEAAd6q3.jpg","type":"photo","url":"https://t.co/3bDz8v0kxx","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[]},"medium":{"faces":[]},"small":{"faces":[]},"orig":{"faces":[]}},"sizes":{"large":{"h":600,"w":400,"resize":"fit"},"medium":{"h":600,"w":400,"resize":"fit"},"small":{"h":600,"w":400,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":600,"width":400,"focus_rects":[{"x":0,"y":376,"w":400,"h":224},{"x":0,"y":200,"w":400,"h":400},{"x":0,"y":144,"w":400,"h":456},{"x":100,"y":0,"w":300,"h":600},{"x":0,"y":0,"w":400,"h":600}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1988523691747971072"}}}],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[]},"extended_entities":{"media":[{"display_url":"pic.x.com/3bDz8v0kxx","expanded_url":"https://x.com/rosstaylor90/status/1988523696017863109/photo/1","id_str":"1988523691747971072","indices":[281,304],"media_key":"3_1988523691747971072","media_url_https":"https://pbs.twimg.com/media/G5inwxIWEAAd6q3.jpg","type":"photo","url":"https://t.co/3bDz8v0kxx","ext_media_availability":{"status":"Available"},"features":{"large":{"faces":[]},"medium":{"faces":[]},"small":{"faces":[]},"orig":{"faces":[]}},"sizes":{"large":{"h":600,"w":400,"resize":"fit"},"medium":{"h":600,"w":400,"resize":"fit"},"small":{"h":600,"w":400,"resize":"fit"},"thumb":{"h":150,"w":150,"resize":"crop"}},"original_info":{"height":600,"width":400,"focus_rects":[{"x":0,"y":376,"w":400,"h":224},{"x":0,"y":200,"w":400,"h":400},{"x":0,"y":144,"w":400,"h":456},{"x":100,"y":0,"w":300,"h":600},{"x":0,"y":0,"w":400,"h":600}]},"allow_download_status":{"allow_download":true},"media_results":{"result":{"media_key":"3_1988523691747971072"}}}]},"favorited":false,"lang":"en","possibly_sensitive":false,"possibly_sensitive_editable":true,"retweeted":false,"fact_check":null,"id":"1988523696017863109","view_count":25465,"bookmark_count":19,"created_at":1762935968000,"favorite_count":97,"quote_count":5,"reply_count":5,"retweet_count":6,"user_id_str":"524807755","conversation_id_str":"1988523696017863109","full_text":"Lots of negative talk about the UK on my timeline recently. \n\nFun fact: much of the team that was behind successful Llama 2/3 releases - which kicked off the open LLM revolution - were based in London. 
There is much more AI talent here than just DeepMind.\n\nEveryone is looking for a Western alternative to Chinese open source, but maybe they should look where the modern West began? đŠ","in_reply_to_user_id_str":null,"in_reply_to_status_id_str":null,"is_quote_status":0,"is_ai":null,"ai_score":null},{"bookmarked":false,"display_text_range":[19,150],"entities":{"hashtags":[],"symbols":[],"timestamps":[],"urls":[],"user_mentions":[{"id_str":"851226008770547713","name":"Felix Drost","screen_name":"felix_drost","indices":[0,12]},{"id_str":"64844802","name":"a16z","screen_name":"a16z","indices":[13,18]}]},"favorited":false,"in_reply_to_screen_name":"felix_drost","lang":"en","retweeted":false,"fact_check":null,"id":"1988538138113810892","view_count":147,"bookmark_count":0,"created_at":1762939412000,"favorite_count":8,"quote_count":0,"reply_count":0,"retweet_count":0,"user_id_str":"524807755","conversation_id_str":"1988523696017863109","full_text":"@felix_drost @a16z âBritain is too smallâ could have been said many times in the past⊠and yet we became a great power. Why shouldnât it happen again?","in_reply_to_user_id_str":"851226008770547713","in_reply_to_status_id_str":"1988536333770412435","is_quote_status":0,"is_ai":null,"ai_score":null}]},{"label":"2025-11-14","value":0,"startTime":1762992000000,"endTime":1763078400000,"tweets":[]}]},"interactions":{"users":[{"created_at":1491783070000,"uid":"851226008770547713","id":"851226008770547713","screen_name":"felix_drost","name":"Felix Drost","friends_count":469,"followers_count":578,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1733182141654056961/BmkenBC7_normal.jpg","description":"Armchair O7 Once an anthropologist & soldier.\n\nđȘđșđłđ±","entities":{"description":{"urls":[]}},"interactions":1},{"created_at":1721463779000,"uid":"1814576577637666816","id":"1814576577637666816","screen_name":"halfatheist","name":"HalfAtheist","friends_count":60,"followers_count":254,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1862498254971445255/jTmnS-nM_normal.jpg","description":"Truth-seeker. Can handle nuance.\n\nđč $BMNR $KDK $JOBY","entities":{"description":{"urls":[]},"url":{"urls":[{"display_url":"linktr.ee/halfatheist","expanded_url":"https://linktr.ee/halfatheist","url":"https://t.co/Mq1GyIrc8y","indices":[0,23]}]}},"interactions":1},{"created_at":1591728953000,"uid":"1270429378246213632","id":"1270429378246213632","screen_name":"joythw","name":"Tom Joy","friends_count":191,"followers_count":190,"profile_image_url_https":"https://pbs.twimg.com/profile_images/1834987722270031872/qGDfPsc0_normal.jpg","description":"Building something new.\nGirlsWhoML co-founder @GirlsWhoML. \nPhD in AI @OxfordTVG. \nPrev Research Scientist @_FiveAI, @Meta, @SLAMcoreLtd\nhe/him","entities":{"description":{"urls":[]},"url":{"urls":[{"display_url":"thwjoy.github.io","expanded_url":"http://thwjoy.github.io","url":"https://t.co/Cl5sD0zwcd","indices":[0,23]}]}},"interactions":1}],"period":14,"start":1761805103584,"end":1763014703584}}},"settings":{},"session":null,"routeProps":{"/creators/:username":{}}}