In mid-December 2025, Tokyo hosted AI Builders Day, a one-day engineers' conference run by the Japanese AWS user group JAWS-UG that sought to take stock of three years of rapid change since the arrival of ChatGPT and to map the current practical landscape for AI agents and AI-driven development. The event, held in person only at Sunshine City in Ikebukuro, drew far greater interest than its organisers anticipated, with more than 850 registrations against 650 advertised places and attendees ranging from their 20s to their 70s; roughly half reported this was their first in-person AWS community event. According to the event report, the conference opening was delivered by Minoru Oda, known by the handle "みのるん", who framed the day as an accessible technical primer intended to prepare participants for the afternoon sessions. [1]
Oda, who was introduced as Japan's first AWS "AI HERO" and serves as a tech evangelist at KDDI Agile Development Center, used his keynote, titled "まだ間に合う! Agentic AI on AWSの現在地をやさしく一挙おさらい" ("There's still time! A gentle one-shot review of where Agentic AI on AWS stands"), to trace engineers' evolving challenges from the appearance of ChatGPT in November 2022 through to the present dominance of agentic approaches. He recounted his own route into the field, saying he "loves AI too much" ("AIが好きすぎて") and had used ChatGPT on its launch day, and used that narrative to explain why many organisations moved quickly in 2023 to embed generative AI via secure, API-based services. "From that point on, not a day went by without hearing the term generative AI" ("そこからは生成AIという言葉を聞かない日はなくなりましたよね"), he said, setting the scene for a technical review aimed at letting attendees "get the most out of the afternoon sessions" ("午後からのセッションを最大限満喫いただけるようにおさらいします!"). [1]
A central thread of the keynote concerned the rise of Amazon Bedrock as the pragmatic on-ramp for companies wishing to call foundation models safely from within corporate applications. Oda highlighted Bedrock's serverless, pay-per-token model and its role in making diverse LLMs available over API, noting concrete cost examples for routine inference. But he also emphasised the operational realities that surfaced when teams tried to put models into production, chief among them the persistence of hallucination and the need to tie generation to verifiable data. According to the event coverage, that need is what drove widespread adoption of Retrieval Augmented Generation (RAG) patterns in 2024 and the emergence of AWS features to simplify RAG pipelines. [1][4]
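The pay-per-token model the keynote highlighted can be made concrete with a little arithmetic. The sketch below uses purely hypothetical per-1K-token prices (not actual Bedrock rates, which vary by model) to show how per-request inference cost is estimated:

```python
# Hypothetical pay-per-token cost estimate for a Bedrock-style pricing model.
# The per-1K-token prices below are illustrative placeholders, not real quotes.

def inference_cost(input_tokens: int, output_tokens: int,
                   price_in_per_1k: float, price_out_per_1k: float) -> float:
    """Return the cost in USD of a single model invocation."""
    return (input_tokens / 1000) * price_in_per_1k \
         + (output_tokens / 1000) * price_out_per_1k

# Example: a RAG-style prompt with retrieved context (~3,000 input tokens)
# and a short generated answer (~500 output tokens).
cost = inference_cost(3000, 500, price_in_per_1k=0.003, price_out_per_1k=0.015)
print(f"${cost:.4f} per request")  # -> $0.0165 per request
```

Because input (prompt plus retrieved context) and output tokens are typically priced differently, long RAG contexts dominate cost at scale, one reason the keynote paired cost examples with retrieval design.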
To reduce the engineering barrier to RAG, Oda reviewed AWS tooling that automates ingestion, embedding and retrieval. He pointed to "Knowledge Bases for Amazon Bedrock", which ingests documents from S3 and other sources, converts text to vector embeddings and stores them for semantic retrieval, and he highlighted Amazon S3 Vectors, a recently introduced lower-cost alternative that uses S3 as a vector index for affordable semantic search. The keynote framed these services as part of a continuum from bespoke RAG stacks to managed, integrated experiences that lower cost and operational complexity. Industry commentary and developer guides from AWS underline this trajectory, showing how knowledge-base tooling and retrievers such as Amazon Kendra and OpenSearch Serverless are commonly combined in production RAG systems. [1][7][4]
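The ingest-embed-retrieve pattern these managed services automate can be sketched in a few lines. This is a toy illustration only: a real pipeline uses a learned embedding model and a vector store such as S3 Vectors or OpenSearch Serverless, whereas the hypothetical `embed()` here just counts keyword hits so the example runs anywhere:

```python
import math

# Toy sketch of the embed-store-retrieve pattern behind managed RAG tooling
# like Knowledge Bases for Amazon Bedrock. The vocabulary and embed() function
# are illustrative stand-ins for a real embedding model.

VOCAB = ["refund", "shipping", "invoice", "password", "agent"]

def embed(text: str) -> list:
    """Map text to a vector (here: simple keyword counts)."""
    words = text.lower().split()
    return [float(words.count(term)) for term in VOCAB]

def cosine(a, b) -> float:
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# "Ingestion": embed each document once and store the vectors.
docs = [
    "how to reset your password",
    "refund policy and invoice questions",
    "shipping times for overseas orders",
]
index = [(doc, embed(doc)) for doc in docs]

# "Retrieval": embed the query and rank stored documents by similarity.
query_vec = embed("where is my refund invoice")
best = max(index, key=lambda pair: cosine(query_vec, pair[1]))
print(best[0])  # -> refund policy and invoice questions
```

The retrieved passage is then injected into the model's prompt, which is the grounding step that RAG uses to counter hallucination.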
Beyond retrieval, the talk explained the behavioural architecture underlying modern agents: the ReAct loop of Reasoning, Acting and Observation. Oda described how agents decompose tasks into plans, call out to tools such as API endpoints or internal systems, observe outcomes and iterate when results are unsatisfactory. This pattern, he argued, underpins practical multi-step workflows such as aggregating survey data and producing presentation artefacts. That practical shift, from single-turn chat to tool-enabled, iterative agents, is precisely what Amazon Bedrock Agents was built to simplify, offering a GUI-driven way to wire foundation models, Lambda tooling, knowledge bases and guardrails into single- or multi-agent solutions. [1][4]
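The Reason, Act, Observe control flow can be shown with a minimal, runnable sketch. In a real agent (for example one built with Amazon Bedrock Agents) an LLM performs the reasoning step and selects tools; here both the `plan()` policy and the `survey_tool()` API are hypothetical stubs so the loop structure itself is visible:

```python
# Minimal ReAct-style loop (Reason -> Act -> Observe) with stubbed components.
# plan() and survey_tool() are illustrative stand-ins, not real APIs.

def survey_tool(question: str) -> dict:
    """Hypothetical internal tool: returns aggregated survey results."""
    return {"question": question, "yes": 42, "no": 8}

def plan(goal: str, observations: list) -> dict:
    """Reasoning step (stubbed): pick the next action from what is known."""
    if not observations:
        return {"action": "call_tool", "input": goal}
    return {"action": "finish"}  # enough evidence gathered; stop the loop

def run_agent(goal: str, max_steps: int = 5) -> list:
    observations = []
    for _ in range(max_steps):
        step = plan(goal, observations)       # Reason
        if step["action"] == "finish":
            break
        result = survey_tool(step["input"])   # Act
        observations.append(result)           # Observe, then iterate
    return observations

obs = run_agent("Do users like the new dashboard?")
print(obs)  # one tool observation, after which the planner decides to finish
```

The `max_steps` bound matters in practice: without it, an agent that keeps judging its results unsatisfactory would loop (and spend tokens) indefinitely.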
AWS itself has signalled that agentic AI is a strategic priority beyond product updates. At the AWS Summit in New York in 2025, the company unveiled Bedrock AgentCore and seven supporting services to help enterprises deploy and operate secure AI agents at scale, expanded marketplace listings and committed a $100 million investment to accelerate agentic AI development. According to AWS communications, those announcements aim to give organisations a clearer enterprise pathway from prototyping to large-scale, governed deployments. Local AWS initiatives, including prototyping camps and hands-on workshops in Tokyo and Houston earlier in 2025, reinforce the company's push to equip developers with both the tools and the applied know-how to move agentic designs into production. [2][3][5][6]
Taken together, the presentations and surrounding AWS activity sketch a fast-maturing ecosystem in which managed platforms, lower-cost vector indexing and agent orchestration aim to turn experimental generative systems into operational capabilities. Yet the keynote and related developer guidance also make clear that important work remains: engineering reliable retrievers and re-rankers, designing robust guardrails to limit hallucination and securing agent operations at enterprise scale are still active challenges. Speaking to that duality, the event framed Bedrock and related services as pragmatic enablers rather than silver bullets: tools that reduce friction while leaving substantive implementation and governance decisions to engineering teams. [1][2][4][7]
📌 Reference Map:
- [1] (ASCII.jp) - Paragraphs 1, 2, 3, 4, 5, 7
- [2] (AWS blog post, AWS Summit New York 2025) - Paragraph 6
- [3] (About Amazon news) - Paragraph 6
- [4] (AWS Builders Flash) - Paragraphs 3, 5, 6
- [5] (AWS Startup Blog) - Paragraph 6
- [6] (AWS workshop page) - Paragraph 6
- [7] (SO Technologies developer blog) - Paragraphs 4, 7
Source: Noah Wire Services