
Batching RTK get query #4787

Open
nouman91 opened this issue Dec 20, 2024 · 1 comment

@nouman91

Hello, and apologies if this has been asked or answered elsewhere—I tried searching but couldn't find relevant information.

I have a use case where I need to fetch data for hundreds of UUIDs. However, our Nginx configuration does not support large query parameters, so I am considering batching the requests. I wanted to check if RTK Query has built-in support for batch fetching or if there's a recommended approach for this scenario.

My use case:

graphQuery: builder.query({
  queryFn: async () => {
    const graph = await fetchBaseGraph();
    const nodes1UUIDs = graph.nodes.map((node) => node.node1);
    const nodes2UUIDs = graph.nodes.map((node) => node.node2);
    const nodes3UUIDs = graph.nodes.map((node) => node.node3);

    /** Each type of node corresponds to a different endpoint */
    const [nodes1, nodes2, nodes3] = await Promise.all([
      fetchNode1WithBatch(nodes1UUIDs),
      fetchNode2WithBatch(nodes2UUIDs),
      fetchNode3WithBatch(nodes3UUIDs),
    ]);

    return { data: { nodes1, nodes2, nodes3 } };
  },
}),

Each node type requires its own endpoint, and I need to process these requests in batches due to the query length limitation. Does RTK Query offer support for batching, or is there a recommended pattern for handling this kind of use case?
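To make the batching part concrete, each fetchNodeXWithBatch helper would roughly need to split the UUID list into chunks and issue several smaller requests. A sketch of what I have in mind (the /api/node1 URL and the batch size of 50 are just placeholders):

const chunk = (items, size) =>
  Array.from({ length: Math.ceil(items.length / size) }, (_, i) =>
    items.slice(i * size, (i + 1) * size)
  );

async function fetchNode1WithBatch(uuids, batchSize = 50) {
  // Split the UUIDs into batches small enough for our Nginx query-length limit
  const batches = chunk(uuids, batchSize);
  const responses = await Promise.all(
    batches.map((batch) =>
      fetch(`/api/node1?ids=${batch.join(',')}`).then((res) => res.json())
    )
  );
  // Flatten the per-batch responses back into a single array of nodes
  return responses.flat();
}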

Thank you for your time and guidance!

@markerikson
Collaborator

There's nothing directly built in for that. Part of it is that RTKQ is not a "normalized" caching tool, but it's also ultimately about tracking the status and contents of a single promise per cache key.

Based on this example, are you intending for each of these items to end up as a separate cache entry? Or do all of these items combine into a single cache entry?

If you're trying to treat these as separate cache entries in the end, we did recently add a new upsertQueryEntries API that's intended for efficiently inserting many cache entries at once.

Looking at your example, you could do the actual fetching work yourself (and treat this graphQuery endpoint as a sort of dummy, where you don't care about the actual saved cache entry but rather are using it to initiate the request), then use upsertQueryEntries to actually fill in the individual item cache entries.
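A rough sketch of that combination, assuming per-item getNode1 / getNode2 / getNode3 endpoints keyed by UUID, and that your fetchBaseGraph / fetchNodeXWithBatch helpers return arrays of node objects with a uuid field (those names and shapes are placeholders, adjust to your actual API):

import { createApi, fetchBaseQuery } from '@reduxjs/toolkit/query/react';

const api = createApi({
  baseQuery: fetchBaseQuery({ baseUrl: '/api' }),
  endpoints: (build) => ({
    // Per-item endpoints that components actually subscribe to
    getNode1: build.query({ query: (uuid) => `node1/${uuid}` }),
    getNode2: build.query({ query: (uuid) => `node2/${uuid}` }),
    getNode3: build.query({ query: (uuid) => `node3/${uuid}` }),

    // "Dummy" endpoint that just kicks off the batched fetching
    graphQuery: build.query({
      queryFn: async (_arg, { dispatch }) => {
        const graph = await fetchBaseGraph();
        // These helpers can chunk the UUID lists however your Nginx limits require
        const [nodes1, nodes2, nodes3] = await Promise.all([
          fetchNode1WithBatch(graph.nodes.map((n) => n.node1)),
          fetchNode2WithBatch(graph.nodes.map((n) => n.node2)),
          fetchNode3WithBatch(graph.nodes.map((n) => n.node3)),
        ]);

        // Insert each fetched item as its own cache entry, all at once
        dispatch(
          api.util.upsertQueryEntries([
            ...nodes1.map((node) => ({ endpointName: 'getNode1', arg: node.uuid, value: node })),
            ...nodes2.map((node) => ({ endpointName: 'getNode2', arg: node.uuid, value: node })),
            ...nodes3.map((node) => ({ endpointName: 'getNode3', arg: node.uuid, value: node })),
          ])
        );

        // We don't really care about this endpoint's own cache entry
        return { data: null };
      },
    }),
  }),
});

Components would then subscribe to the individual useGetNode1Query(uuid) (etc.) hooks and read from the upserted entries, while something higher up triggers graphQuery once to do the actual fetching.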
