Non-determinism in nowadays computing and IT.

MIPRO (2020)

Abstract
Once upon a time, computers and the computations they delivered were considered ultimately deterministic. Today, however, we encounter random events, i.e. non-determinism, in many areas of computing practice: non-determinism introduced by network latency, and inherently non-deterministic computations with 'big data' and 'deep learning', where the results are probability distributions with errors that are themselves probability distributions, both depending on the initial selection of samples, etc. For IT education, one of the most disturbing sources of non-determinism and non-repeatability of earlier examples comes from the massive use of libraries and APIs (Application Programming Interfaces), which has made the 'top-down' style of programming common and has encouraged neglect of practical issues such as the finiteness of computer memory and speed. The significance of the classical source of knowledge, the printed hard-cover book, is diminishing: by the time a book reaches print there are already new versions of programs, new protocols, new technologies and new libraries, and these new versions often do not work with the old ones. The most relevant source of information has become the Internet. But the Internet is full of useless sources, since 'the Internet never forgets': alongside sources describing the latest program versions, libraries and technologies, there are still tens of publications that rely on program versions, libraries and technologies that are by now outdated. Students eager to perform well in their next recruiting interview spend many evenings trying to swim in this swamp of non-deterministic mess, where most of the presented examples are not repeatable. And the situation is getting worse, since many authors of, e.g., YouTube videos do not want to teach but to earn money through Google AdSense.
Keywords
computation, determinism, networks, deep learning, big data, Internet