
How AI Agents Assemble the Climate Narratives Report
This post explores how the assembler agent transforms raw data into the Climate Narratives Report, showcasing AI’s role in streamlining workflows and generating actionable insights.
This post explores the Climate Resilience Data Platform, showcasing how it sources climate articles, tracks social conversations, and classifies narratives. A behind-the-scenes look at refining data products to reveal societal insights and advance the mission of RepublicOfData.io.
The need for automation in data analysis, especially around complex topics like climate change, has never been greater. In this post, I'll walk you through how I'm developing an Investigative Report AI Agent prototype as part of my Climate Resilience Data Platform. This agent processes large-scale online conversations …
This prototype showcases an AI agent that investigates social conversations on climate change. It assesses discussions, gathers additional context from external tools, and generates summaries, demonstrating how AI can enrich data platforms with deeper insights.
Discover how we built a climate data platform using Dagster for orchestration, LangChain for AI agents, and OpenAI for data enrichment. The platform mines media articles and tracks social network conversations to analyze evolving climate narratives across time and geography.
Explore how climate change conversations differ across the U.S. with a map highlighting dominant discourse types in each state. See how regional attitudes shape the discussion on this issue.
Conversations Analyzer - Part 3 of 3 📚 The Conversations Analyzer series: - Part 1: Building up a Local Generative AI Engine with Ollama, Fabric and OpenWebUI - Part 2: Optimizing and Engineering LLM Prompts with LangChain and LangSmith - Part 3: Enforcing Structure and Assembling Our AI Agent This is …
Conversations Analyzer - Part 2 of 3 📚 The Conversations Analyzer series: - Part 1: Building up a Local Generative AI Engine with Ollama, Fabric and OpenWebUI - Part 2: Optimizing and Engineering LLM Prompts with LangChain and LangSmith - Part 3: Enforcing Structure and Assembling Our AI Agent In my …
Dear Friends of RepublicOfData.io, Thank you for being a part of our RepublicOfData.io community! At some point in our journey, you showed interest in our mission, and for that, we are incredibly grateful. RepublicOfData.io began as a seed of intuition. Over the past year, it has evolved …
Conversations Analyzer - Part 1 of 3 📚 The Conversations Analyzer series: - Part 1: Building up a Local Generative AI Engine with Ollama, Fabric and OpenWebUI - Part 2: Optimizing and Engineering LLM Prompts with LangChain and LangSmith - Part 3: Enforcing Structure and Assembling Our AI Agent For the past …
Introducing the New Social Signals Package Analytics extends far beyond just business data. In our digitally interconnected societies, we are constantly creating and interacting with an immense volume of social data. At RepublicOfData.io, we are deeply committed to responsibly tapping into this vast resource. Our approach prioritizes ethical harvesting, …
An exposé of that narrative at AWS re:Invent Hey, thanks for reading this. My name is Olivier Dupuis and I've been building data products for the past 10 years or so. I've worked within many organizations, at different stages of data maturity and participated in scaling data …
Considering different approaches to managing a data ecosystem I still remember the excitement of discovering dbt and the Modern Data Stack ecosystem more than 5 years ago. Letting go of homegrown ETL systems built on Python scripts, SQL stored procedures and cron jobs. How liberating it was to adopt a …
Hey there, Let's talk about ethical analytics, shall we? I feel like this should be as important as data quality, scalability, infrastructure costs, etc. Yet, we generally tend to avoid the topic, or touch on it only briefly when discussing risks. Shouldn't it be a cornerstone of our practice, …