Federated Learning Round

Server orchestrates clients without seeing data — local training, weight upload, aggregation.

When to use this prompt

For privacy-preserving ML papers, and for federated setups in edge AI, healthcare, and remote sensing.

The prompt

A federated learning round diagram, drawn as a star topology.

Center — Aggregation Server:
- Holds the global model w_t.
- Sends w_t to all clients at round start (broadcast arrows out).

Around — N=5 Clients:
- Each client i has private data D_i (drawn as a small cylinder beside the client).
- Each client trains for a few local epochs on D_i, producing updated weights w_t^i (or weight delta).
- Each client uploads w_t^i to the server (arrows in).

Server (next step):
- Aggregates client updates by a sample-weighted average (FedAvg): w_{t+1} = sum_i (n_i / n) * w_t^i, where n_i is client i's sample count and n = sum_i n_i.
- Broadcasts w_{t+1} to clients for the next round.

Show one round of the loop with a small "Round t -> t+1" annotation. Highlight in a small callout that "data never leaves clients".
Style: clean publication-style schematic, navy and teal palette, white background, thin connectors, suitable for medical / federated systems venues.
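The aggregation step described in the prompt can be sketched in a few lines of NumPy. This is an illustrative sketch of sample-weighted averaging (FedAvg-style), not part of the prompt itself; the function name `fedavg` and the dummy local updates are assumptions for the example.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Sample-weighted average of client models:
    w_{t+1} = sum_i (n_i / n) * w_t^i, with n = sum_i n_i."""
    total = sum(client_sizes)
    return sum((n / total) * w for w, n in zip(client_weights, client_sizes))

# One simulated round with N=5 clients and a tiny 3-parameter model.
rng = np.random.default_rng(0)
global_w = np.zeros(3)                      # server broadcasts w_t
client_sizes = [100, 200, 50, 150, 100]     # n_i: private dataset sizes
# Each client "trains" locally; here a dummy perturbation stands in for SGD.
client_updates = [global_w + rng.normal(size=3) for _ in client_sizes]
global_w = fedavg(client_updates, client_sizes)  # server aggregates w_{t+1}
```

Only the weight vectors cross the network; the datasets behind `client_sizes` never leave the clients, which is the property the diagram's callout highlights.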

Variations

Differential-privacy variant

Add per-client noise injection (epsilon) before upload, and a callout: "Each client adds calibrated Gaussian noise to its update for (epsilon, delta)-DP."
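For reference while drawing the callout, the per-client step can be sketched as the standard Gaussian mechanism: clip the update's L2 norm to bound sensitivity, then add calibrated noise before upload. The names `dp_update`, `clip_norm`, and `noise_multiplier` are illustrative assumptions, not from the prompt.

```python
import numpy as np

def dp_update(delta, clip_norm, noise_multiplier, rng):
    """Clip a client's weight delta to L2 norm <= clip_norm (bounding
    sensitivity), then add Gaussian noise calibrated to that bound."""
    norm = np.linalg.norm(delta)
    clipped = delta * min(1.0, clip_norm / norm) if norm > 0 else delta
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=delta.shape)
    return clipped + noise

rng = np.random.default_rng(42)
noisy = dp_update(np.array([3.0, 4.0]), clip_norm=1.0,
                  noise_multiplier=0.5, rng=rng)  # what the client uploads
```

The (epsilon, delta) guarantee depends on `noise_multiplier` and the number of rounds, which is why the variant's callout mentions calibration.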

Tips

  • Show "data never leaves clients" prominently. It is the defining property of federated learning.
  • Pick a star topology, not a linear pipeline. The visual centrality of the server matters.
  • Annotate the aggregation rule explicitly (FedAvg, FedProx, etc.).
