The example below illustrates a simple GraphQL proxy combining two remote GraphQL schemas into a single API:

```python
from ariadne.asgi import GraphQL
from ariadne_graphql_proxy import ProxySchema, get_context_value

proxy_schema = ProxySchema()
proxy_schema.add_remote_schema("https://example.com/e-commerce/")
proxy_schema.add_remote_schema("https://example.com/product-reviews/")

final_schema = proxy_schema.get_final_schema()

app = GraphQL(
    final_schema,
    context_value=get_context_value,
    root_value=proxy_schema.root_resolver,
    debug=True,
)
```
Let's walk through this code step by step:
`ProxySchema` is a factory object provided by Ariadne GraphQL Proxy. It stores GraphQL schemas and the rules for modifying them, and then gives you back a final schema and a root resolver that knows how to split the GraphQL query received by the proxy between the schemas it combines.

Creating an instance of `ProxySchema` is the first step in creating a GraphQL proxy:

```python
from ariadne_graphql_proxy import ProxySchema

proxy_schema = ProxySchema()
```
`ProxySchema` has two methods that can be used to add schemas to the proxy: `add_remote_schema` and `add_schema`.

Our proxy uses the `add_remote_schema` method to combine two remote GraphQL schemas into the final schema:

```python
from ariadne_graphql_proxy import ProxySchema

proxy_schema = ProxySchema()
proxy_schema.add_remote_schema("https://example.com/e-commerce/")
proxy_schema.add_remote_schema("https://example.com/product-reviews/")
```

When the application starts, `add_remote_schema` takes the URL of a remote GraphQL API supporting introspection, runs the introspection query, and then includes the resulting schema in `ProxySchema`.
`add_remote_schema` also accepts optional arguments that can be used to exclude parts of the remote schema from the final schema used by the proxy. Those are documented later in this document.
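As a quick preview, here is a minimal sketch of excluding parts of a remote schema; the excluded field and directive names are illustrative:

```python
from ariadne_graphql_proxy import ProxySchema

proxy_schema = ProxySchema()

# Illustrative: drop a "webhooks" field from the remote Query type and the
# "@auth" directive before the schema is merged into the proxy.
proxy_schema.add_remote_schema(
    "https://example.com/e-commerce/",
    exclude_fields={"Query": ["webhooks"]},
    exclude_directives=["auth"],
)
```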
The `get_final_schema` method of `ProxySchema` produces the final `GraphQLSchema` instance from the schemas previously added to the proxy with either `add_remote_schema` or `add_schema`:

```python
from ariadne_graphql_proxy import ProxySchema

proxy_schema = ProxySchema()
proxy_schema.add_remote_schema("https://example.com/e-commerce/")
proxy_schema.add_remote_schema("https://example.com/product-reviews/")

final_schema = proxy_schema.get_final_schema()
```
The final schema:

- merges the sub schemas into a single `GraphQLSchema`
- records the origin of every schema's fields
- creates the query filter that `root_resolver` uses to split a query into smaller queries sent to the remote schemas and the local schema.
Ariadne GraphQL Proxy doesn't provide its own server component. Instead, you should use Ariadne's `asgi.GraphQL` app to create the server:

```python
from ariadne.asgi import GraphQL
from ariadne_graphql_proxy import ProxySchema, get_context_value

proxy_schema = ProxySchema()
proxy_schema.add_remote_schema("https://example.com/e-commerce/")
proxy_schema.add_remote_schema("https://example.com/product-reviews/")

final_schema = proxy_schema.get_final_schema()

app = GraphQL(
    final_schema,
    context_value=get_context_value,
    root_value=proxy_schema.root_resolver,
    debug=True,
)
```
Note that this server is initialized with `context_value` and `root_value`. Those are required for the proxy to work:

- `context_value` exposes request data such as headers to `root_resolver`.
- `root_resolver` acts as the top-level resolver that uses the query filter to split the GraphQL query received by the proxy and calls the remote schemas to retrieve their data.
You can start your server using any ASGI server. For example, using Uvicorn:

```
pip install uvicorn
uvicorn my-proxy:app --port 8000
```

Assuming the above Python code lives in `my-proxy.py`, uvicorn will start the proxy server on the 127.0.0.1:8000 address on your computer.
`ProxySchema.get_final_schema` returns a `GraphQLSchema` instance which can be additionally mutated to set custom resolvers on its fields.

`ariadne_graphql_proxy` exports `set_resolver` and `unset_resolver` utils that can be used to set custom resolver functions on `GraphQLSchema` fields:
```python
final_schema = proxy_schema.get_final_schema()

def custom_resolver(obj, info):
    ...

set_resolver(final_schema, "Query", "products", custom_resolver)
```
If the field you are setting a custom resolver for comes from a remote schema, that field will still be queried by the root resolver before your resolver is run for it.

If you want to exclude the field from the root resolver, set it as delayed using `ProxySchema`'s `add_delayed_fields` method:
```python
proxy_schema.add_delayed_fields({"Query": ["products"]})

final_schema = proxy_schema.get_final_schema()

def custom_resolver(obj, info):
    ...

set_resolver(final_schema, "Query", "products", custom_resolver)
```
If the field you are setting a resolver on already has a resolver, that resolver will be replaced with the new one.

If the field you are setting a custom resolver for comes from a remote schema, you will first have to set this field as delayed, using `ProxySchema`'s `add_delayed_fields` method:
```python
proxy_schema.add_delayed_fields({"Query": ["products"]})

final_schema = proxy_schema.get_final_schema()

def custom_products_resolver(obj, info):
    ...

set_resolver(final_schema, "Query", "products", custom_products_resolver)
```
`ProxyResolver` can be used to proxy a portion of the query from `GraphQLResolveInfo` (the `info` argument) to a given GraphQL server:
```python
resolve_products = ProxyResolver(
    "https://example.com/e-commerce/",
    proxy_headers=True,
)

set_resolver(final_schema, "Query", "products", resolve_products)
```
`ProxyResolver` rewrites the GraphQL query received by the server to only contain the fields and arguments that apply to the selected field.

It requires a single argument:

- `url`: a `str` with the URL of the GraphQL API to which the query should be proxied.
It takes the following optional arguments:

- `proxy_headers`: `Union[bool, Callable, List[str]]`
- `cache`: `CacheBackend`
- `cache_key`: `Union[str, Callable[[GraphQLResolveInfo], str]]`
- `cache_ttl`: `int`

The `proxy_headers` option is documented in the "Configuring headers" section of this guide. The `cache`, `cache_key` and `cache_ttl` arguments are documented in the cache section of this guide.
Ariadne GraphQL Proxy supports relations between the combined GraphQL schemas. For example, one schema may implement a mutation returning a type that is defined in, and retrieved from, another schema:
```graphql
type Query {
  order(id: ID!): Order
}

type Order {
  id: ID!
  customer: Customer
  billingAddress: Address!
  shippingAddress: Address
  payments: [Payment!]!
}
```

```graphql
type Mutation {
  checkoutComplete(checkoutId: ID!): CheckoutComplete!
}

type CheckoutComplete {
  order: Order
  errors: [String!]
}

type Order {
  id: ID!
}
```
Notice how one service implements the `Order` type and the `order` field on `Query`, while the other defines only a small `Order` type containing just its ID, but also a mutation that may return this `Order`.

Once those schemas are combined in `ProxySchema`, the following schema will be produced:
```graphql
type Query {
  order(id: ID!): Order
}

type Mutation {
  checkoutComplete(checkoutId: ID!): CheckoutComplete!
}

type Order {
  id: ID!
  customer: Customer
  billingAddress: Address!
  shippingAddress: Address
  payments: [Payment!]!
}

type CheckoutComplete {
  order: Order
  errors: [String!]
}
```
But there is a problem now. Because the GraphQL API implementing `checkoutComplete` only returns the `id` field for the order, querying the order's other non-nullable fields like `billingAddress` or `payments` will result in the GraphQL proxy raising an error about those non-nullable fields receiving `None` as their value. This is because the proxy knows that the `order` field on `CheckoutComplete` doesn't support those fields, so it excludes them from the query.

We could solve this by implementing those fields in both schemas, but we can also tell `ProxySchema` to retrieve the remaining fields from the other type, using the order's id already returned by the other service.
In the GraphQL Proxy, relations are resolved in two steps:

- A field in the schema that returns an object is set as a foreign key, making the root resolver rewrite queries containing this field to always query for the requested object's identifier, and the identifier only.
- After the root resolver has retrieved the related object's identifier from one query, it passes this identifier to the field's resolver, which uses it to retrieve the remaining data from the other service.
Looking at the above example, this query:

```graphql
mutation CompleteCheckout {
  checkoutComplete(checkoutId: "dsa987dsa97dsa98") {
    order {
      id
      billingAddress
      shippingAddress
      payments {
        id
        method
        status
        amount {
          currency
          amount
        }
      }
    }
    errors
  }
}
```
Will be split into two queries, with the first one executed by the root resolver:

```graphql
mutation CompleteCheckout {
  checkoutComplete(checkoutId: "dsa987dsa97dsa98") {
    order {
      id
    }
    errors
  }
}
```
The order's id, together with its other fields from the first query, will then be passed to `CheckoutComplete`'s `order` resolver, whose goal is to execute the following query to retrieve the remaining data:
```graphql
query GetForeignKeyOrder($id: ID!) {
  order(id: $id) {
    id
    billingAddress
    shippingAddress
    payments {
      id
      method
      status
      amount {
        currency
        amount
      }
    }
  }
}
```
Finally, the GraphQL query executor implemented by the `graphql-core` library will combine the data from the root resolver and the data from the field's resolver into the final query result.
We start by using the `add_foreign_key` method to tell `ProxySchema` to always request the `id` field when the `order` field is queried on `CheckoutComplete`:
```python
from ariadne_graphql_proxy import ProxySchema

proxy_schema = ProxySchema()
proxy_schema.add_remote_schema("https://example.com/store/")
proxy_schema.add_remote_schema("https://example.com/checkout/")

proxy_schema.add_foreign_key("CheckoutComplete", "order", "id")

final_schema = proxy_schema.get_final_schema()
```
This guarantees that when the `order` field on `CheckoutComplete` is queried, this part of the query will always be rewritten by the root resolver to `order { id }` before being sent to the checkout service.

The relation is still missing the logic that takes the order id retrieved by the root resolver, together with the other fields queried for `order`, and queries the other service to retrieve those fields' values.
You could implement this logic completely by yourself in a custom resolver for the `CheckoutComplete.order` field, but Ariadne GraphQL Proxy already provides a `ForeignKeyResolver` utility that implements it:
```python
from ariadne_graphql_proxy import ForeignKeyResolver, ProxySchema, set_resolver

proxy_schema = ProxySchema()
proxy_schema.add_remote_schema("https://example.com/store/")
proxy_schema.add_remote_schema("https://example.com/checkout/")

proxy_schema.add_foreign_key("CheckoutComplete", "order", "id")

final_schema = proxy_schema.get_final_schema()

fk_order_resolver = ForeignKeyResolver(
    "https://example.com/store/",
    """
    query GetForeignKeyOrder($id: ID!) {
        order(id: $id) {
            __FIELDS
        }
    }
    """
)

set_resolver(final_schema, "CheckoutComplete", "order", fk_order_resolver)
```
The `ForeignKeyResolver` class requires two configuration options to work:

- the URL of the GraphQL API to query to retrieve the related object,
- the GraphQL query to use to retrieve the related object.
The query provided to `ForeignKeyResolver` is a template. The `__FIELDS` field in it is not a magical GraphQL feature; it is a placeholder in the template that is replaced with the finally requested fields before the query is sent to the service. `__FIELDS` will be replaced with whatever fields were originally requested for `order`.
The final GraphQL proxy implementation with a foreign key looks like this:

```python
from ariadne.asgi import GraphQL
from ariadne_graphql_proxy import (
    ForeignKeyResolver,
    ProxySchema,
    get_context_value,
    set_resolver,
)

proxy_schema = ProxySchema()
proxy_schema.add_remote_schema("https://example.com/e-commerce/")
proxy_schema.add_remote_schema("https://example.com/product-reviews/")

proxy_schema.add_foreign_key("CheckoutComplete", "order", "id")

final_schema = proxy_schema.get_final_schema()

fk_order_resolver = ForeignKeyResolver(
    "https://example.com/store/",
    """
    query GetForeignKeyOrder($id: ID!) {
        order(id: $id) {
            __FIELDS
        }
    }
    """
)

set_resolver(final_schema, "CheckoutComplete", "order", fk_order_resolver)

app = GraphQL(
    final_schema,
    context_value=get_context_value,
    root_value=proxy_schema.root_resolver,
    debug=True,
)
```
The following APIs support creating a new schema that is a subset of another schema:

- `ProxySchema.add_remote_schema`
- `ProxySchema.add_schema`
- `copy_schema`

To create a subset of another schema, specify which fields for `queries` and (optionally) `mutations` should be available in the final schema:
```python
from ariadne_graphql_proxy import ProxySchema

proxy_schema = ProxySchema()
proxy_schema.add_remote_schema(
    "https://example.com/e-commerce/",
    queries=["categories", "products"],
)
```
All other `Query` fields, and the types that aren't used by the remaining fields, will be removed from the final schema, making it much smaller. If the `mutations` option was not used, the `Mutation` type will also be removed from the schema.

The `queries` and `mutations` arguments can be combined with `exclude_types`, `exclude_fields`, `exclude_args`, `exclude_directives` and `exclude_directives_args`.
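For example, a short sketch combining both kinds of filtering; the excluded field name is illustrative:

```python
from ariadne_graphql_proxy import ProxySchema

proxy_schema = ProxySchema()

# Illustrative: keep only two Query fields and additionally drop a
# hypothetical "internalNotes" field from the Product type.
proxy_schema.add_remote_schema(
    "https://example.com/e-commerce/",
    queries=["categories", "products"],
    exclude_fields={"Product": ["internalNotes"]},
)
```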
`get_query_params_resolver` returns a preconfigured resolver that takes a URL string and the passed arguments, and generates a URL with those arguments as query params. It can be used to add rendering options to an imgix.com image URL.
It takes the following arguments:

- `get_url`: a `str` or a `Callable` which returns a `str`. If `get_url` is a `str`, the resolver will split it by `.` and use the substrings as keys to get the value from an `obj` dict, or as attribute names for non-dict objects, e.g. with `get_url` set to `"imageData.url"` the resolver will use one of `obj["imageData"]["url"]`, `obj["imageData"].url`, `obj.imageData["url"]`, `obj.imageData.url` as the URL string. If `get_url` is a callable, then the resolver will call it with `obj`, `info` and `**kwargs` and use the result as the URL string.
- `extra_params`: an optional `dict` of query params to be added to the URL string. These can be overridden by kwargs passed to the resolver.
- `get_params`: an optional `Callable` to be called on the passed `**kwargs` before they are added to the URL string.
- `serialize_url`: an optional `Callable` to be called on the URL string with query params already added. Its result is returned directly by the resolver.
In this example, we assume there is a GraphQL server which provides the following schema:

```graphql
type Query {
  product: Product!
}

type Product {
  imageUrl: String!
}
```

`imageUrl` returns a URL string served by imgix.com, and we want to add another field with a thumbnail URL.
```python
from ariadne_graphql_proxy import ProxySchema, get_context_value, set_resolver
from ariadne_graphql_proxy.contrib.imgix import get_query_params_resolver

proxy_schema = ProxySchema()
proxy_schema.add_remote_schema("https://remote-schema.local")
proxy_schema.insert_field(
    type_name="Product",
    field_str="thumbnailUrl(w: Int, h: Int): String!",
)

final_schema = proxy_schema.get_final_schema()

set_resolver(
    final_schema,
    "Product",
    "thumbnailUrl",
    get_query_params_resolver(
        "imageUrl",
        extra_params={"h": 128, "w": 128, "fit": "min"},
    ),
)
```
With the added resolver, `thumbnailUrl` will return `imageUrl` with additional query parameters. `fit` is always set to `min`. `w` and `h` are set to `128` by default, but can be changed by query arguments, e.g.:
```graphql
query getProduct {
  product {
    imageUrl
    thumbnailUrl
    smallThumbnailUrl: thumbnailUrl(w: 32, h: 32)
  }
}
```

```json
{
  "data": {
    "product": {
      "imageUrl": "https://test-imageix.com/product-image.jpg",
      "thumbnailUrl": "https://test-imageix.com/product-image.jpg?h=128&w=128&fit=min",
      "smallThumbnailUrl": "https://test-imageix.com/product-image.jpg?h=32&w=32&fit=min"
    }
  }
}
```
Ariadne GraphQL Proxy requires that the `GraphQLResolveInfo.context` attribute is a dictionary containing a `headers` key, which itself is a `Dict[str, str]` dictionary.
`get_context_value`, importable from `ariadne_graphql_proxy`, is a convenience utility compatible with Ariadne's ASGI application that returns a `dict` with `request` and `headers` keys. Header names are normalized to lowercase, e.g. the `Authorization` header will be available as `context["headers"]["authorization"]` if it was included in the request to the GraphQL server.
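As an illustration (the resolver below is hypothetical), a resolver can read the proxied request's headers from that context dict like this:

```python
def resolve_viewer(obj, info):
    # Hypothetical resolver reading headers from the context dict built by
    # get_context_value; header names are lowercased.
    authorization = info.context["headers"].get("authorization")
    return {"authenticated": authorization is not None}
```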
It is possible to configure headers per schema with the second positional argument of the `add_remote_schema` method:

```python
schema.add_remote_schema("https://example.com/graphql", {"Authorization": "Bearer T0K3N"})
```
The configured headers will be included in all HTTP requests to `https://example.com/graphql` made by the `ProxySchema`. This excludes requests made by `ForeignKeyResolver` and `ProxyResolver`, which require headers to be configured on them separately.
If you need to create headers from the `context` (e.g. to proxy the authorization header), you can use a function instead of a `dict`:
```python
def get_proxy_schema_headers(context):
    if not context:
        # Context is not available when `ProxySchema` retrieves remote schema for the first time
        return {"Authorization": "Bearer T0K3N"}

    return context.get("headers")

schema.add_remote_schema("https://example.com/graphql", get_proxy_schema_headers)
```
Both the foreign key and proxy resolver constructors take `proxy_headers` as their second option. This option controls headers proxying:

- If this option is not set, no headers are set on proxied queries.
- If `proxy_headers` is `True` and the `context["headers"]` dictionary exists, its `authorization` value will be proxied.
- If `proxy_headers` is a `List[str]`, it is assumed to be a list of names of headers that should be proxied from `context["headers"]`.
- If `proxy_headers` is a callable, it will be called with a single argument (`context`) and should return either `None` or a `Dict[str, str]` with headers to send to the other server (see the sketch after this list).
- If `proxy_headers` is `None` or `False`, no headers are proxied to the other service.
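A minimal sketch of the callable form, assuming `ProxyResolver` is importable from `ariadne_graphql_proxy` like the other utilities and that you want to forward the client's `authorization` and `accept-language` headers (both choices are illustrative):

```python
from ariadne_graphql_proxy import ProxyResolver

def proxy_selected_headers(context):
    # Called by the resolver with the context dict; returning None skips headers.
    if not context:
        return None

    headers = context.get("headers") or {}
    return {
        name: value
        for name, value in headers.items()
        if name in ("authorization", "accept-language")
    }

resolve_products = ProxyResolver(
    "https://example.com/e-commerce/",
    proxy_headers=proxy_selected_headers,
)
```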
In situations where a field depends on data from sibling fields in order to be resolved, `ProxySchema` can be configured to include those additional fields in the root value query sent to the remote schema.

The below example pulls a remote schema that defines a `Product` type, extends this type with an `image: String` field, and then uses `ProxySchema.add_field_dependencies` to configure `{ metadata { thumb } }` as additional fields to retrieve when the `image` field is queried. It also includes a custom resolver for the `image` field that uses this additional data:
```python
from ariadne.asgi import GraphQL
from ariadne_graphql_proxy import (
    ProxySchema,
    get_context_value,
    set_resolver,
)
from graphql import build_ast_schema, parse

proxy_schema = ProxySchema()

# Store schema ID for remote schema
remote_schema_id = proxy_schema.add_remote_schema(
    "https://example.com/graphql/",
)

# Extend Product type with additional image field
proxy_schema.add_schema(
    build_ast_schema(
        parse(
            """
            type Product {
                image: String
            }
            """
        )
    )
)

# Configure proxy schema to retrieve thumb from metadata
# from remote schema when image is queried
proxy_schema.add_field_dependencies(
    remote_schema_id, "Product", "image", "{ metadata { thumb } }"
)

# Create schema instance
final_schema = proxy_schema.get_final_schema()

# Add product image resolver
def resolve_product_image(obj, info):
    return obj["metadata"]["thumb"]

set_resolver(final_schema, "Product", "image", resolve_product_image)

# Setup Ariadne ASGI GraphQL application
app = GraphQL(
    final_schema,
    context_value=get_context_value,
    root_value=proxy_schema.root_resolver,
    debug=True,
)
```
Ariadne GraphQL Proxy implements a basic cache framework that enables caching parts of GraphQL queries.

Currently only a simple in-memory cache backend is provided, but developers may implement their own backends.

All cache utilities are importable from the `ariadne_graphql_proxy.cache` package.
Note: If you are using `ProxySchema`, remember to exclude the fields you are going to cache from the root resolver with the `add_delayed_fields` method, or your data will not be cached!
`simple_cached_resolver` is a decorator for resolvers that caches their results for the given resolver arguments:

```python
from ariadne_graphql_proxy.cache import InMemoryCache, simple_cached_resolver

cache_backend = InMemoryCache()

@simple_cached_resolver(cache_backend, "products")
def resolve_products(_, info, **filters):
    # Resolve products using filters
    ...
```
It requires two arguments:

- `backend`: a `CacheBackend` subclass instance which will be used to cache the resolver's results.
- `prefix`: a `str` with the cache prefix, or a `Callable[[GraphQLResolveInfo], str]` used to obtain this prefix `str` from `info`; it is combined with the resolver's arguments to create the final cache key.

It also has the following optional argument:

- `ttl`: an `int` with the time to live for the cached value, in seconds.
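Putting this together with the earlier note about delayed fields, a sketch of caching a remote `Query.products` field could look like this (the field name and TTL are illustrative, and it assumes `ttl` can be passed as a keyword argument):

```python
from ariadne_graphql_proxy import ProxySchema, set_resolver
from ariadne_graphql_proxy.cache import InMemoryCache, simple_cached_resolver

cache_backend = InMemoryCache()

proxy_schema = ProxySchema()
proxy_schema.add_remote_schema("https://example.com/e-commerce/")

# Exclude the cached field from the root resolver, per the note above.
proxy_schema.add_delayed_fields({"Query": ["products"]})

final_schema = proxy_schema.get_final_schema()

# Cache results for 5 minutes per distinct set of resolver arguments.
@simple_cached_resolver(cache_backend, "products", ttl=300)
def resolve_products(_, info, **filters):
    # Hypothetical body: fetch products from the remote service or a database.
    ...

set_resolver(final_schema, "Query", "products", resolve_products)
```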
`cached_resolver` is a decorator for resolvers that caches their results for the given resolver arguments and the selected fields:

```python
from ariadne_graphql_proxy.cache import InMemoryCache, cached_resolver

cache_backend = InMemoryCache()

@cached_resolver(cache_backend, "products")
def resolve_products(_, info, **filters):
    # Resolve products using filters
    ...
```
It requires two arguments:

- `backend`: a `CacheBackend` subclass instance which will be used to cache the resolver's results.
- `prefix`: a `str` with the cache prefix, or a `Callable[[GraphQLResolveInfo], str]` used to obtain this prefix `str` from `info`; it is combined with the resolver's arguments and the queried fields to create the final cache key.

It also has the following optional argument:

- `ttl`: an `int` with the time to live for the cached value, in seconds.
Both `ForeignKeyResolver` and `ProxyResolver` accept caching options:

- `cache`: an `Optional[CacheBackend]` with the `CacheBackend` to use to cache results.
- `cache_key`: a `str` with the cache prefix, or a `Callable[[GraphQLResolveInfo], str]` used to obtain this prefix `str` from `info`; it is combined with the resolver's arguments and the queried fields to create the final cache key.
- `cache_ttl`: an `int` with the time to live for the cached value, in seconds.

To enable the cache, both `cache` and `cache_key` need to be set.
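For example, a sketch of a cached `ProxyResolver` (the TTL value is illustrative and the imports assume the same module layout as the earlier examples):

```python
from ariadne_graphql_proxy import ProxyResolver, ProxySchema, set_resolver
from ariadne_graphql_proxy.cache import InMemoryCache

cache_backend = InMemoryCache()

proxy_schema = ProxySchema()
proxy_schema.add_remote_schema("https://example.com/e-commerce/")

# Exclude the cached field from the root resolver so the ProxyResolver below
# is the only thing querying it.
proxy_schema.add_delayed_fields({"Query": ["products"]})

final_schema = proxy_schema.get_final_schema()

# Cache proxied "products" results for 5 minutes.
resolve_products = ProxyResolver(
    "https://example.com/e-commerce/",
    proxy_headers=True,
    cache=cache_backend,
    cache_key="products",
    cache_ttl=300,
)

set_resolver(final_schema, "Query", "products", resolve_products)
```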
Custom cache backends should extend the `ariadne_graphql_proxy.cache.CacheBackend` class and need to implement the `set` and `get` methods:
```python
class CacheBackend:
    async def set(self, key: str, value: Any, ttl: Optional[int] = None):
        ...

    async def get(self, key: str, default: Any = None) -> Any:
        ...
```
The `__init__` method takes a `serializer` argument which defaults to `NoopCacheSerializer()` and can be overridden:
```python
class CacheBackend:
    def __init__(self, serializer: Optional[CacheSerializer] = None) -> None:
        self.serializer: CacheSerializer = serializer or NoopCacheSerializer()
```
Backends can also optionally implement the `clear_all` method, but it's not used by Ariadne GraphQL Proxy outside of tests:
```python
class CacheBackend:
    async def clear_all(self):
        ...
```
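As an illustration, a minimal custom backend could keep values in a plain dict. This is only a sketch based on the `CacheBackend` interface shown above; the class name and the TTL handling are illustrative:

```python
import time
from typing import Any, Optional

from ariadne_graphql_proxy.cache import CacheBackend

class DictCacheBackend(CacheBackend):
    """Illustrative backend keeping serialized values in a Python dict."""

    def __init__(self, serializer=None):
        super().__init__(serializer)
        self._store = {}

    async def set(self, key: str, value: Any, ttl: Optional[int] = None):
        # Store the serialized value together with an optional expiry time.
        expires_at = time.monotonic() + ttl if ttl else None
        self._store[key] = (self.serializer.serialize(value), expires_at)

    async def get(self, key: str, default: Any = None) -> Any:
        try:
            serialized, expires_at = self._store[key]
        except KeyError:
            return default
        if expires_at is not None and expires_at < time.monotonic():
            del self._store[key]
            return default
        return self.serializer.deserialize(serialized)
```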
Currently only `NoopCacheSerializer` and `JSONCacheSerializer` are provided, but developers may implement their own serializers.
Custom cache serializers should extend the `ariadne_graphql_proxy.cache.CacheSerializer` class and need to implement the `serialize` and `deserialize` methods:
```python
class CacheSerializer:
    def serialize(self, value: Any) -> str:
        ...

    def deserialize(self, value: str) -> Any:
        ...
```
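For example, a sketch of a custom serializer that stores pickled values as base64 strings (purely illustrative; the provided `JSONCacheSerializer` already covers JSON-compatible values):

```python
import base64
import pickle
from typing import Any

from ariadne_graphql_proxy.cache import CacheSerializer

class PickleCacheSerializer(CacheSerializer):
    """Illustrative serializer storing pickled values as base64 strings."""

    def serialize(self, value: Any) -> str:
        return base64.b64encode(pickle.dumps(value)).decode("ascii")

    def deserialize(self, value: str) -> Any:
        return pickle.loads(base64.b64decode(value.encode("ascii")))
```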
`CloudflareCacheBackend` uses Cloudflare's key-value storage for caching. It can be imported from `ariadne_graphql_proxy.contrib.cloudflare` and requires the following arguments:

- `account_id`: `str`: id of the Cloudflare account.
- `namespace_id`: `str`: id of the worker's KV namespace.
- `headers`: `Optional[Dict[str, str]]`: headers attached to every API call, defaults to `{}`.
- `base_url`: `str`: Cloudflare API base URL, defaults to `"https://api.cloudflare.com/client/v4"`.
- `serializer`: `Optional[CacheSerializer]`: serializer used to process cached and retrieved values, defaults to `ariadne_graphql_proxy.cache.JSONCacheSerializer()`.
```python
from ariadne_graphql_proxy.contrib.cloudflare import CloudflareCacheBackend

cache = CloudflareCacheBackend(
    account_id="account id",
    namespace_id="workers kv namespace id",
    headers={"Authorization": "Bearer ..."},
    base_url="https://cloudflare_api_url/client/v4",
)
```
`CloudflareCacheBackend` lists the existing keys in the given namespace on initialization to ensure it can be accessed; if this check fails it throws `CloudflareCacheError`. To store a value it performs a PUT request, and to retrieve a saved value it uses GET.
`DynamoDBCacheBackend` uses Amazon DynamoDB for storing cached values. It requires the `boto3` package, which can be installed using pip:

```
pip install ariadne-graphql-proxy[aws]
```
It can be imported from `ariadne_graphql_proxy.contrib.aws` and requires the following arguments:

- `table_name`: `str`: name of the DynamoDB table.
- `partition_key`: `str`: partition key, defaults to `key`.
- `ttl_attribute`: `str`: TTL attribute, defaults to `ttl`.
- `session`: `Optional[Session]`: instance of `boto3.session.Session`, defaults to `Session()`, which reads configuration values as described in the boto3 documentation.
- `serializer`: `Optional[CacheSerializer]`: serializer used to process cached and retrieved values, defaults to `ariadne_graphql_proxy.cache.JSONCacheSerializer()`.
```python
from ariadne_graphql_proxy.contrib.aws import DynamoDBCacheBackend
from boto3.session import Session

cache = DynamoDBCacheBackend(
    table_name="table name",
    partition_key="partition key",
    ttl_attribute="ttl attribute",
    session=Session(
        aws_access_key_id="access key id",
        aws_secret_access_key="secret",
        region_name="region name",
    ),
)
```
`DynamoDBCacheBackend` checks the status of the given table on initialization to ensure it can be accessed; if this check fails due to an unavailable table it throws `DynamoDBCacheError`.

`DynamoDBCacheBackend` sets the given TTL in Unix epoch time format. Expired items are excluded from results, but they aren't deleted from the table; this is left to the DynamoDB engine.
The `ProxySchema` class, importable from `ariadne_graphql_proxy`, is a factory class for proxy GraphQL schemas.

It has the following methods:
```python
def __init__(self, root_value: Optional[RootValue] = None):
    ...
```
The constructor for `ProxySchema` takes a single optional argument, `root_value`.

This argument's behavior is identical to the `root_value` option of Ariadne's `GraphQL` server.

It is either the root value to pass as the first argument to `Query` and `Mutation` field resolvers, or a callable that should return this value.
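For illustration (the value itself is arbitrary), a static root value could be passed like this:

```python
from ariadne_graphql_proxy import ProxySchema

# Illustrative: every Query and Mutation field resolver receives this dict
# as its first argument.
proxy_schema = ProxySchema(root_value={"app_version": "1.0"})
```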
```python
def add_remote_schema(
    self,
    url: str,
    *,
    exclude_types: Optional[List[str]] = None,
    exclude_args: Optional[Dict[str, Dict[str, List[str]]]] = None,
    exclude_fields: Optional[Dict[str, List[str]]] = None,
    exclude_directives: Optional[List[str]] = None,
    exclude_directives_args: Optional[Dict[str, List[str]]] = None,
    extra_fields: Optional[Dict[str, List[str]]] = None,
) -> int:
    ...
```
Downloads a remote GraphQL schema from the address in the `url` argument, modifies it using the specified options, and adds it to the final schema.

Returns an `int` with the sub schema ID that can be used to retrieve the schema using the `get_sub_schema` method.
- `url`: a `str` with the URL of a remote GraphQL API that supports introspection.
- `exclude_types`: a `List[str]` with names of GraphQL types from the remote schema that should be excluded from the downloaded schema. E.g. `["CheckoutCreate", "CheckoutComplete"]` will remove both the `CheckoutCreate` and `CheckoutComplete` types, and all fields that use them as input values or return values.
- `exclude_args`: a `Dict[str, Dict[str, List[str]]]` with names of field arguments that should be removed from the downloaded schema. E.g. `{"Query": {"users": ["search"]}}` will remove the `search` argument from the `users` field on the `Query` type.
- `exclude_fields`: a `Dict[str, List[str]]` with names of fields that should be removed from the downloaded schema. E.g. `{"Query": ["webhooks"]}` will remove the `webhooks` field from the `Query` type.
- `exclude_directives`: a `List[str]` with names of directives that should be removed from the downloaded schema. E.g. `["auth"]` will remove the `@auth` directive.
- `exclude_directives_args`: a `Dict[str, List[str]]` with names of directive arguments that should be removed from the downloaded schema. E.g. `{"auth": ["roles"]}` will remove the `roles` argument from the `@auth` directive.
- `extra_fields`: a `Dict[str, List[str]]` with lists of types' fields that have been excluded from the downloaded schema, but should still be queried, because their return values are compatible with the final schema's fields. E.g. `{"CheckoutResult": ["error"]}` will make the root resolver still query the `error` field on the `CheckoutResult` type, even if it was excluded using one of the above options.
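A short sketch combining a few of these options; the type, field, and directive names are taken from the examples above and are illustrative:

```python
from ariadne_graphql_proxy import ProxySchema

proxy_schema = ProxySchema()

# Illustrative: trim the remote schema before it is merged into the proxy.
proxy_schema.add_remote_schema(
    "https://example.com/graphql/",
    exclude_args={"Query": {"users": ["search"]}},
    exclude_directives_args={"auth": ["roles"]},
    extra_fields={"CheckoutResult": ["error"]},
)
```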
```python
def add_schema(
    self,
    schema: GraphQLSchema,
    url: Optional[str] = None,
    *,
    exclude_types: Optional[List[str]] = None,
    exclude_args: Optional[Dict[str, Dict[str, List[str]]]] = None,
    exclude_fields: Optional[Dict[str, List[str]]] = None,
    exclude_directives: Optional[List[str]] = None,
    exclude_directives_args: Optional[Dict[str, List[str]]] = None,
    extra_fields: Optional[Dict[str, List[str]]] = None,
) -> int:
    ...
```
Adds a `GraphQLSchema` instance as a sub schema to include in the final schema.

Returns an `int` with the sub schema ID that can be used to retrieve the schema using the `get_sub_schema` method.
- `schema`: a `GraphQLSchema` instance to include in the final schema.
- `url`: a `str` with the URL of the remote GraphQL API against which this schema's fields should be resolved. Used when the sub schema represents a remote schema or a part of one.
- `exclude_types`: a `List[str]` with names of GraphQL types that should be excluded from the added schema. E.g. `["CheckoutCreate", "CheckoutComplete"]` will remove both the `CheckoutCreate` and `CheckoutComplete` types, and all fields that use them as input values or return values.
- `exclude_args`: a `Dict[str, Dict[str, List[str]]]` with names of field arguments that should be removed from the added schema. E.g. `{"Query": {"users": ["search"]}}` will remove the `search` argument from the `users` field on the `Query` type.
- `exclude_fields`: a `Dict[str, List[str]]` with names of fields that should be removed from the added schema. E.g. `{"Query": ["webhooks"]}` will remove the `webhooks` field from the `Query` type.
- `exclude_directives`: a `List[str]` with names of directives that should be removed from the added schema. E.g. `["auth"]` will remove the `@auth` directive.
- `exclude_directives_args`: a `Dict[str, List[str]]` with names of directive arguments that should be removed from the added schema. E.g. `{"auth": ["roles"]}` will remove the `roles` argument from the `@auth` directive.
- `extra_fields`: a `Dict[str, List[str]]` with lists of types' fields that have been excluded from the added schema, but should still be queried, because their return values are compatible with the final schema's fields. E.g. `{"CheckoutResult": ["error"]}` will make the root resolver still query the `error` field on the `CheckoutResult` type, even if it was excluded using one of the above options.
```python
def add_delayed_fields(self, delayed_fields: Dict[str, List[str]]):
    ...
```

Sets specific fields in the schema as delayed. Delayed fields are excluded from the queries run by `root_resolver` against the remote GraphQL APIs.

The `delayed_fields` argument is a dict of type names and lists of field names:

```python
{"Type": ["field", "otherField"], "OtherType": ["field"]}
```
```python
def add_field_dependencies(
    self, schema_id: int, type_name: str, field_name: str, query: str
):
    ...
```

Adds the fields specified in `query` as dependencies for `field_name` of `type_name` that should be retrieved from the schema with `schema_id`.

- `schema_id`: an `int` with the ID of a schema returned by `add_remote_schema` or `add_schema`.
- `type_name`: a `str` with the name of the type for which dependencies will be set.
- `field_name`: a `str` with the name of the field whose dependencies will be set.
- `query`: a `str` with the additional fields to fetch when `field_name` is included, e.g. `{ metadata { key value } }`.
```python
def add_foreign_key(
    self, type_name: str, field_name: str, on: Union[str, List[str]]
):
    ...
```

Sets a specific field in the schema as a foreign key.

- `type_name`: a `str` with the name of the type whose field will be set as a foreign key.
- `field_name`: a `str` with the name of the field which will be set as a foreign key.
- `on`: a `str` or `List[str]` with the names of the fields that should be queried when this field is included in a query.
```python
def get_sub_schema(self, schema_id: int) -> GraphQLSchema:
    ...
```

Returns the sub schema with the given ID. If the schema doesn't exist, raises `IndexError`.
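For example, a short sketch of retrieving a previously added sub schema by its ID:

```python
from ariadne_graphql_proxy import ProxySchema

proxy_schema = ProxySchema()

# add_remote_schema returns the sub schema's ID.
remote_schema_id = proxy_schema.add_remote_schema("https://example.com/graphql/")

# Retrieve the GraphQLSchema instance stored for that sub schema.
remote_sub_schema = proxy_schema.get_sub_schema(remote_schema_id)
```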
```python
def insert_field(self, type_name: str, field_str: str):
    ...
```

Inserts a field into all schemas with the given `type_name`. The field is automatically delayed, i.e. excluded from the queries run by `root_resolver` against the remote GraphQL APIs.

- `type_name`: a `str` with the name of the type into which the field will be inserted.
- `field_str`: a `str` with the SDL representation of the field, e.g. `"fieldA(argA: String!): Int"`.
```python
def get_final_schema(self) -> GraphQLSchema:
    ...
```

Combines the sub schemas into a single `GraphQLSchema` object that can then be used with the Ariadne GraphQL app to run a GraphQL server.
```python
async def root_resolver(
    self,
    context_value: dict,
    operation_name: Optional[str],
    variables: Optional[dict],
    document: DocumentNode,
) -> Optional[dict]:
    ...
```

A callable that should be passed to the Ariadne GraphQL server's `root_value` option. It retrieves the root value by splitting the original query and calling the remote GraphQL servers.