Document worth reading: “Graph Neural Processes: Towards Bayesian Graph Neural Networks”
We introduce Graph Neural Processes (GNP), inspired by recent work in conditional and latent neural processes. A Graph Neural Process is defined as a Conditional Neural Process that operates on arbitrary graph data. It takes features of sparsely observed context points as input, and outputs a distribution over target points. We demonstrate graph neural processes in edge imputation and discuss benefits and drawbacks of the method for other application areas. One major benefit of GNPs is the ability to quantify uncertainty in deep learning on graph structures. An additional benefit of this method is the ability to extend graph neural networks to inputs of dynamically sized graphs.
Graph Neural Processes: Towards Bayesian Graph Neural Networks
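The conditional-neural-process structure the abstract describes (encode sparsely observed context points into a shared representation, then decode a predictive distribution over target points) can be sketched as follows. This is a minimal NumPy illustration with untrained random weights, not the authors' implementation; the dimensions and the per-edge feature representation are assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, b1, w2, b2):
    # Two-layer perceptron with tanh hidden activation.
    return np.tanh(x @ w1 + b1) @ w2 + b2

# Hypothetical dimensions: per-edge feature size, hidden width, representation size.
d_feat, d_hid, d_rep = 4, 16, 8

# Random (untrained) encoder and decoder weights.
enc_w1 = rng.normal(size=(d_feat + 1, d_hid)); enc_b1 = np.zeros(d_hid)
enc_w2 = rng.normal(size=(d_hid, d_rep));      enc_b2 = np.zeros(d_rep)
dec_w1 = rng.normal(size=(d_feat + d_rep, d_hid)); dec_b1 = np.zeros(d_hid)
dec_w2 = rng.normal(size=(d_hid, 2));          dec_b2 = np.zeros(2)

def gnp_predict(ctx_feats, ctx_labels, tgt_feats):
    """Encode observed context edges, aggregate, decode target edges.

    ctx_feats:  (n_ctx, d_feat) features of sparsely observed edges
    ctx_labels: (n_ctx,)        observed edge values (e.g. weight/presence)
    tgt_feats:  (n_tgt, d_feat) features of edges to impute
    Returns a Gaussian mean and std per target edge.
    """
    # Encoder: embed each (feature, label) context pair independently.
    ctx_in = np.concatenate([ctx_feats, ctx_labels[:, None]], axis=1)
    reps = mlp(ctx_in, enc_w1, enc_b1, enc_w2, enc_b2)
    # Permutation-invariant aggregation into one representation r;
    # this is what lets the model accept dynamically sized graphs.
    r = reps.mean(axis=0)
    # Decoder: condition each target on r, output (mean, log_std).
    dec_in = np.concatenate([tgt_feats, np.tile(r, (len(tgt_feats), 1))], axis=1)
    out = mlp(dec_in, dec_w1, dec_b1, dec_w2, dec_b2)
    mean, log_std = out[:, 0], out[:, 1]
    return mean, np.exp(log_std)  # positive std quantifies uncertainty

mean, std = gnp_predict(rng.normal(size=(5, d_feat)),
                        rng.normal(size=5),
                        rng.normal(size=(3, d_feat)))
print(mean.shape, std.shape)  # (3,) (3,)
```

The mean aggregation over context embeddings is the standard Conditional Neural Process design choice: because it is permutation invariant and independent of the number of context points, the same weights serve graphs of any size.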