Comment by btown a day ago
It seems this is focused on on-device computation - as distinct from, say, Cloudflare's definition of the "edge" as a smart CDN with an ability to run arbitrary code and AI models in geographically distributed data centers (https://workers.cloudflare.com/).
Per Microsoft's definition in https://github.com/microsoft/edgeai-for-beginners/blob/main/...:
> EdgeAI represents a paradigm shift in artificial intelligence deployment, bringing AI capabilities directly to edge devices rather than relying solely on cloud-based processing. This approach enables AI models to run locally on devices with limited computational resources, providing real-time inference capabilities without requiring constant internet connectivity.
(This isn't necessarily just Microsoft's definition - https://www.redhat.com/en/topics/edge-computing/what-is-edge... from 2023 defines edge computing as on-device as well, and is cited in https://en.wikipedia.org/wiki/Edge_computing#cite_note-35)
I suppose that the definition "edge is anything except a central data center" is consistent between these two approaches, and there's overlap in needing reliable ways to deploy code to less-trusted/less-centrally-controlled environments... but it certainly muddies the techniques involved.
At this rate of term overloading, the next thing you know we'll be using the word "edgy" to describe teenagers or something...
I work at an industrial plant, where we use "edge" to refer to something inside the production network.
As an example, the control system network is air-gapped, so to use ML for instrument control or similar, the model needs to run on some type of "edge" compute device inside the production network, and all of the inferencing would need to happen locally (i.e. not in the cloud).
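To make that concrete, here's a minimal sketch of what "all inferencing happens locally" can look like on such a box, assuming an ONNX model and the onnxruntime CPU provider (the model file name, input shape, and sensor values are all hypothetical, not from any real plant):

```python
# Hypothetical sketch: inference runs entirely on a local "edge" machine,
# so no data ever leaves the air-gapped production network.
import numpy as np
import onnxruntime as ort

# Load a locally stored model; only the CPU provider is used, no network access.
session = ort.InferenceSession(
    "instrument_model.onnx",
    providers=["CPUExecutionProvider"],
)

# Feed it a batch of (made-up) sensor readings and read back the prediction.
input_name = session.get_inputs()[0].name
sensor_readings = np.array([[21.4, 0.98, 551.2]], dtype=np.float32)

outputs = session.run(None, {input_name: sensor_readings})
print("predicted setpoint:", outputs[0])
```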