Platform engineering and internal developer portals are a growing trend in the tech industry, aimed at making developers more efficient. For example, how do we help new developers ship their first feature faster? GraphQL helps Platform API efforts ship features faster, but what happens when your schema grows very complex? How can a new developer find what they need quickly? GraphQL already provides a complete and understandable description of the data in our APIs, so what if we gave that context to an LLM? In this talk, we'll journey through GitHub's APIs and explore how a GraphQL schema is a significant advantage for AI-based tooling. More and more AI-based tools generate fetch code from OpenAPI definitions, and while they may be tempting at first, adopting them can come with unexpected trade-offs. We'll show how to take a standard open-source LLM and provide it with GraphQL-aware context to generate operations from text input. After this talk, you'll be able to safely bring AI to your developer efficiency initiatives with any LLM, whether third-party or self-hosted!
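
To give a flavor of the approach, here is a minimal, illustrative sketch (not the talk's actual implementation): it assumes an open-source LLM served behind an OpenAI-compatible chat-completions endpoint, and the URL, model name, example schema, and helper name `generateOperation` are all hypothetical. The idea is simply to pass the GraphQL schema (SDL) as system-prompt context so the model generates operations only against fields it knows exist.

```ts
// Sketch: generate a GraphQL operation from a natural-language request,
// using the schema SDL as context for a locally hosted, OpenAI-compatible LLM.
// Endpoint URL, model name, and schema below are placeholder assumptions.

const schemaSDL = `
type Query {
  repository(owner: String!, name: String!): Repository
}
type Repository {
  issues(first: Int): IssueConnection
}
`; // in practice, load the full schema via introspection or a .graphql file

async function generateOperation(request: string): Promise<string> {
  const response = await fetch("http://localhost:8000/v1/chat/completions", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "local-llm", // placeholder for whichever self-hosted model you run
      messages: [
        {
          role: "system",
          content:
            "You write GraphQL operations. Only use fields defined in this schema:\n" +
            schemaSDL,
        },
        { role: "user", content: request },
      ],
    }),
  });
  const data = await response.json();
  // Return the model's reply, which should be a GraphQL operation
  return data.choices[0].message.content;
}

// Example usage:
generateOperation("List the 10 most recent issues for octocat/hello-world")
  .then(console.log);
```

Because the schema travels with the prompt, this pattern works the same way whether the LLM is a third-party service or self-hosted, which is the portability the talk emphasizes.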